It's FCAT Writing time. Do you know who's scoring your child's exam?
As fourth, eighth and tenth graders across Florida write their hearts out (really?) for the FCAT, we thought it appropriate to consider who will evaluate their work and decide whether they're meeting grade level expectations.
No, it's not people who answer a Craigslist ad (at least not anymore). But is it much better than that?
Not according to the tales of former scorers for companies including Pearson, as reported last week in the Minneapolis City Pages: "Now scorers from local companies are drawing back the curtain on the clandestine business of grading student essays, a process they say goes too fast; relies on cheap, inexperienced labor; and does not accurately assess student learning."
Florida has recently gone from two scorers per essay to just one, although 20 percent of essays will be read by two for "quality assurance purposes." The state requires that all scorers hold at least a bachelor's degree in English or a related field, complete intensive training sessions, and pass all qualification examinations.
But who are they? The Florida Department of Education doesn't know.
"The scorers are Pearson employees," DOE spokesman Tom Butler told the Gradebook, noting the company advertises, recruits and hires all the scoring candidates.
As a reminder, Pearson is the company that paid nearly $15 million in fines to the state last year for failing to score FCAT exams on time, with superintendents questioning the accuracy of the results. More recently, it sent FCAT Writing exams without proper cover sheets to several school districts.
Back in 2000, the company made headlines in Minnesota for misgrading 45,739 graduation tests, leading to a lawsuit and an $11 million settlement.
Here's more from the FLDOE on how it tries to ensure that the scorers are doing their job:
"How is the accuracy of scoring determined?
- Supervisors read behind all scorers throughout the scoring session. This is called backreading, and it is done throughout the scoring session to identify scorers who may need additional training and monitoring.
- If any essays are scored incorrectly, supervisors review the essays and scores with the scorer, and then provide guidance on how to score more accurately.
- Essays with scores previously established by Florida educators on the FCAT Rangefinder and Rangefinder Review Committees are embedded in the flow of student essays that are scored.
- When scorers score these embedded essays (validity papers), their scores are compared to the scores established by the Rangefinder committees. The results are compiled as validity reports and reviewed by Scoring Directors and DOE staff throughout the day and scoring sessions.
- Analysis of the validity reports allows Scoring Directors to determine which essays are most often scored incorrectly and which scorers are most often in disagreement with the established scores in order to target retraining sessions as needed. This also helps to identify which already-scored papers must be rescored.
What happens if a scorer does not demonstrate accuracy?
- Retraining is conducted for any scorer whose performance falls below acceptable standards.
- If retraining is unsuccessful, the scorer is dismissed from the program. Scores that they have submitted to date are deleted from the system and these essays are rescored."