To believe in the fairness and accuracy of the state's new teacher evaluations, you'd have to believe that more than two-thirds of the teachers at Springstead High School — 69 of them — are better than the very best teacher at either Nature Coast Technical or Central high schools.
You'd also have to believe that only a handful of teachers in the entire district "need improvement." (As a parent, I can name at least a half-dozen.)
You'd have to accept that teachers can be graded on the results of test scores in subjects they don't teach.
And you'd have to have faith in the state Department of Education to manage a numbingly complex system after it got one of the most basic facts in its first evaluation report dead wrong: initially, it overcounted the number of public school teachers in the state by tens of thousands.
One other thing: If you're tempted to believe the best news to come out of the report — that Hernando's teachers stack up quite well — don't.
Different counties have different ways of grading teachers, so the high percentage of educators in the highest categories here, compared to nearby counties, is just about meaningless.
So is, as should be clear by now, almost all of last week's report, partly based on the "value added" evaluation model.
And that's too bad.
The system may be a bit test-heavy for the likes of Hernando Classroom Teachers Association president Joe Vitalo, but even unions accept the need for more accountability. And considering that the value-added idea was part of President Obama's Race to the Top initiative, it's got some bipartisan support.
Personally, having seen local schools' performance generally rise in the era of high-stakes testing in Florida, having had the misfortune to be educated in the 1970s, when the best evaluations might go to the best buddies of the principals, and having spent a lot of time sitting on bean bag chairs reading magazines, "learning at my own speed," I'm all for accountability. And, yes, for testing.
At least if it's as focused as the value-added model, which would make up a large portion of teachers' evaluations. The idea is that teachers are judged by gains in learning, as measured by year-end tests, in the subjects they teach.
So how did the state go wrong?
It applied the model to too many subjects. However well it might work for reading, math and science, it's hard to see how it can be used for judging the performance of art and music teachers — yet it is.
Worse, it was rushed into place in 2011, during the first legislative session under Gov. Rick Scott.
Maybe Scott and the Republican lawmakers who pushed the bill just wanted to get on the forefront of the teacher accountability movement. Maybe, like Republicans in several other states that year, they were taking a poke at a Democratic stronghold — teachers unions.
It probably also didn't hurt that the well-connected companies producing year-end tests and the materials needed to prepare for them stand to make a bundle of money — much of which, by the way, will come from individual districts rather than the state.
Whatever the reasons, this is the result of all that haste:
Most districts have hardly any of the needed year-end tests available yet, so they had to rely on the Florida Comprehensive Assessment Test for the evaluations.
That means average teachers at a high-performing school such as Springstead can end up looking like stars. It also means that a science teacher might be judged on the work of the reading teacher across the hall.
And though it will be a few years before the ratings help determine teachers' pay — one of the main goals of the program — this year's results will go in their files.
They could even lose their jobs if, for this year and two more, they are judged to "need improvement."
There might not be too many Hernando teachers who fall into that category, but this program definitely does.