
Gradebook

Education news and notes from Tampa Bay and Florida

FCAT writing test scores lower because of tougher standards

The Buros Center for Testing, first brought to Florida to review the state's botched 2006 FCAT third-grade reading test, has weighed in on this year's steep decline in FCAT writing scores.

Its conclusion? More kids fared poorly because the State Board of Education made the test harder to pass — just as Commissioner Gerard Robinson has said all along. A key excerpt:

What prompted the dramatic drop in students’ proficiency rates this year? There is ultimately no way to answer this question with factual accuracy, and hypotheses are educated speculations. A few of the possible reasonings follow.

1. The essay prompts were simply more difficult.

2. The more rigorous scoring criteria implemented by the Department of Education in the attempt to increase standards affected the scores so that lower scores resulted.

3. Because the scoring standards were increased in 2012, the essays selected to be the anchor, training, and validity papers were assigned scores that were somewhat lower than what they would have been under the previous writing standards, and the operational scoring of the student essays duplicated this more stringent essay scoring.

4. The actual writing of students declined from one year to the next.

Of the hypotheses above, only the first and last ones can be largely discounted. Addressing the fourth, it is simply improbable that writing instruction and student quality across the state would have declined to such an extent across a single year. Without a catastrophe having occurred to the State and its educational system, such an explanation would defy logic and experience. Similarly, the essay prompts had been pretested during earlier testing years and had been found to generate score profiles similar to previous prompts. Thus, the first hypothesis can also be largely discounted.

The independent group, based at the University of Nebraska-Lincoln, found that the test scorers were well trained and that the state's new, more rigorous standards were appropriately applied. It did suggest, though, that the FLDOE reconsider its reuse of scorer training materials, so that scorer candidates must exercise their own judgment rather than simply recall the correct score for an essay they have seen before. Buros also cautioned that scoring essays is not an exact science.

[Last modified: Tuesday, June 5, 2012 3:04pm]