
Gradebook

Education news and notes from Tampa Bay and Florida

Florida education department responds to FCAT science concerns

April 20, 2012

Earlier this week, north Florida's Happy Scientist, Robert Krampf, wrote a stinging blog post criticizing Florida's science FCAT materials as just plain wrong. 

Winter Garden resident Bill Sawyer read the post and fired off an email to the state asking officials to resolve the problems without delay.

Sharon Koon, assistant deputy commissioner for assessment, replied that the department is taking Krampf's comments into account as it revises the FCAT specifications for science. She also addressed some of Krampf's specific criticisms:

"There were two items mentioned by Mr. Kampf, in addition to several definitions.  In response to his comments, I provide the following.  Sample item 2 was included in the Specifications as an illustration of Benchmark SC.5.N.1.6, “Recognize and explain the difference between personal opinion/interpretation and verified observation.”  The sample item was written to align to that benchmark, with the intention of contrasting the ability to verify a count of birds against a more subjective rating of whether something was softer, prettier, or smelled sweeter.  While the sample item could have been written more clearly, the intention was to demonstrate the format of a sample item that could be used to measure that specific benchmark.  Separate from this sample item, there are very specific guidelines given to item writers that address the quality of items, including that all developed test items should have only one correct answer.  These guidelines are described in the beginning sections of the Specifications.  Similarly, the diagram included with sample item 7 could have more clearly identified the surface of the scratch plate.  Again, there are specific guidelines about graphics that are used with test items."

Sawyer responded that he was satisfied the state appears on track to resolve the problems. The department forwarded the email chain to the Gradebook to let us (and therefore you) know the issue is not being ignored. Read on for the full set of emails.

Initial email

Dr. Sharon Koon:

It has come to my attention through a very well-written blog post by Robert Kampf, posted at http://thehappyscientist.com/blog/problems-floridas-science-fcat-test, that there are issues with this document. As a concerned, informed, and well-educated citizen, I have also reviewed the document in question, and I agree with Mr. Kampf that these issues demand resolution.

In the same blog post, Mr. Kampf notes that he requested information on the Content Advisory Committee members, and was advised by Mr. Steve Ash that such information would not be given out. I do not think I should have to remind you, but under the Florida Sunshine law (http://myflsunshine.com/) and as described in the Sunshine Manual (http://www.myflsunshine.com/sun.nsf/sunmanual), you cannot refuse to disclose such information.

So, I will ask this through this informal channel in the hopes that it can be resolved without further delay. Please send me the names and contact information for the committee members that developed the content for the FCAT 2.0 Science Test Item Specifications - Grade 5.

Regards, Bill Sawyer, Winter Garden, FL

Response

Dear Mr. Sawyer:

The FCAT 2.0 Science Test Item Specifications document serves as a resource that defines the content and format of the test and test items for item writers and reviewers.  It also serves to provide all stakeholders with information about the scope and function of FCAT 2.0 Science.  It was written at the beginning of the development of FCAT 2.0 Science in 2010.  The sample items in the Specifications are not intended to serve as current practice items for students.  The Department develops sample test items for this purpose. 

There were two items mentioned by Mr. Kampf, in addition to several definitions.  In response to his comments, I provide the following.  Sample item 2 was included in the Specifications as an illustration of Benchmark SC.5.N.1.6, “Recognize and explain the difference between personal opinion/interpretation and verified observation.”  The sample item was written to align to that benchmark, with the intention of contrasting the ability to verify a count of birds against a more subjective rating of whether something was softer, prettier, or smelled sweeter.  While the sample item could have been written more clearly, the intention was to demonstrate the format of a sample item that could be used to measure that specific benchmark. Separate from this sample item, there are very specific guidelines given to item writers that address the quality of items, including that all developed test items should have only one correct answer.  These guidelines are described in the beginning sections of the Specifications.  Similarly, the diagram included with sample item 7 could have more clearly identified the surface of the scratch plate.  Again, there are specific guidelines about graphics that are used with test items. 

Items that are placed on FCAT 2.0 Science go through a rigorous process that begins with item development based on the Specifications, but then proceeds through item reviews (including bias, sensitivity, and expert review), field testing, and then statistical reviews based on the field-test results.  Due to the theory-based nature of the content area, all potential science test items undergo an extra level of scrutiny. Participants on the science expert review committee examine newly developed science test items to ensure the accuracy and currency of the science content. Participants include practicing scientists from the private sector and university-level science researchers and faculty.

During the review of field-test statistics, student responses are reviewed to ensure that there is no evidence of an item having multiple correct options, despite all of the reviews conducted prior to field testing.  If an item is found to have evidence of multiple correct options, it is discarded or revised and field tested again.  A more detailed description of this process can be found in the FCAT Handbook, located at: http://fcat.fldoe.org/handbk/fcathandbook.asp.  Evidence of these quality control measures can be found by reviewing some of the FCAT Science tests that have been released in the past at: http://fcat.fldoe.org/fcatrelease.asp. 

The Department's Test Development Center did receive feedback from Mr. Kampf and many of his comments have contributed to the revisions that are currently being made as a part of the scheduled update to the Specifications document.  

Lastly, you requested a list of the committee members who participated in the development of the current version of the Specifications.  Attached, please find that list.

Thank you for being involved in Florida education.  I hope this information addresses your concerns.

Sincerely, Sharon Koon, Ph.D., Assistant Deputy Commissioner

Response

Dear Dr. Koon:

Thank You! This is exactly the kind of response I consider very appropriate and very professional. You have established an admiral leadership position with this thoughtful response.

While I have serious concerns over the very validity of using high stakes testing, like the FCAT, for evaluating schools, teachers, and students, I do believe that while we are using the FCAT, or similar tests, we must have the absolute best preparatory materials that can be had.

I applaud Mr. Kampf for becoming actively involved. And, I applaud you for taking the leadership position to respond appropriately to the issues he raised. Your response gives me a much better sense that ultimately these problems will be resolved.

With my Deepest Thanks, Bill Sawyer, Winter Garden, FL

[Last modified: Thursday, April 26, 2012 11:26am]

