
Gradebook

Education news and notes from Tampa Bay and Florida

A weekend interview with Sharon Koon, Florida assistant deputy commissioner of assessment

April 7, 2012

The Florida Department of Education this year has added a new set of rules to its FCAT administration procedures, all related to possible cheating on the annual high-stakes exam. These include having students sign a pledge that they won't cheat and having teachers keep detailed seating charts for the testing rooms. Dr. Sharon Koon, assistant deputy commissioner of accountability, research and measurement, spoke with reporter Jeff Solochek about the reasons behind this heightened sensitivity to the potential for misbehavior on the FCAT.

Why did we put some of these rules into place now?

The primary reason is we want to, just like many other states and nationally, ensure the integrity of the data that is used to measure student achievement and to ensure meaningful education accountability. Those are really the primary reasons why we're putting some of the new measures in place. We've always had a lot of measures in place. We just added a few this year to really complement the measures. For instance, the addition of the student pledge really complements the move to statistical analyses, and having that systematic look at similarity of student responses. While we're doing the statistics we need to also make sure that students understand we are using the statistics. So some of these work hand in hand, and then others really have been in place for a long time. We just worked to make them a little more specific.

Let's talk about the student pledge. Is it something that you're looking at as being binding in a way? You could say, the students signed this; they know they could be held accountable if they're cheating. Or is it just an awareness thing?

It's really an awareness. It's not a legally binding agreement. But we want to make sure that students understand what cheating is. Many students understand that looking at someone else's paper is cheating. But they don't understand that allowing someone else to look at their paper is cheating as well. So I think it's just an awareness piece. And also to let them know that if they do this, there are consequences. ...

Don't you think that's already in play? You don't think that kids already know that? I would imagine this would make them more nervous. They know they're not supposed to cheat. But this is more like, "You better beware."

I don't know. You've got students who maybe feel pressured to allow another student to look at their paper. And maybe in some cases this might be helpful to a student who might be unaware of how to handle this type of situation. The pledge allows them the extra support to not be afraid to cover their paper. Maybe for some kids, it might make them feel nervous about the consequences. But I would think the awareness of the consequences ahead of time would outweigh that and make the administration more secure. ...

What about the seating chart? Can you explain the reason for the seating chart, including, I think it said, which direction the students are facing?

Sure. Last year we implemented the new Caveon analyses looking at similarities of responses. And when students were flagged we asked the districts to conduct investigations. ... What we found in many cases was that accurate seating charts weren't being kept, because we didn't require them. And when we went back to look at where students were seated we found in some cases, as I mentioned, accurate seating charts weren't kept. And in some cases when a student finished testing in a room and then needed extended time in another room, it wasn't clear in all cases where they were sitting at each step of the way. So when we're running these analyses and asking districts to research the flagging of these students, it's important to know exactly where they were seated, which direction they were seated in, were they seated there the whole time. ... It's really just a way to make sure we have accurate information on that testing environment, throughout testing, so that if a student is flagged for having a very similar answer pattern to another student, we know where they were seated.

Is that similar rationale for requiring teacher ID numbers on test packets and things like that, being able to track who was in the room and who came in contact with what?

Correct. We always want to know who came into the room when testing was occurring. That goes along with the chain of custody of materials. We want to make sure we know who has access to these materials at all times. And in terms of the group of students that is being tested, we want to know who the test administrator is. ... In some cases in the past those records haven't been kept as accurately as we would like. And, yes, it is consistent. ... These are all things we would want to look at if a school or student was flagged for any type of irregularity.

Why so much concern about cheating these days? Is that really the big issue? I recall very few instances in the past several years. ...

Florida has always, always looked at cheating and looked at trying to identify irregularities. When we issued our RFP ... part of that RFP was to have a systematic review of all of our test results to, again, ensure the integrity of our data. So it's not new. It's not a new focus. It's just that now we have more systematic ways to look at all of our data that we didn't have in the past. With these new ways we need to actually have the supports in place that we can then use as pieces of evidence when something occurs. So it's not new to us. It's not a new focus. It just seems like it because it's a national issue. But in Florida it's not new. It's just we're getting better at making sure our results are meaningful.

When you say the results are meaningful, I look at that and wonder, meaningful to who?

As I mentioned, we are making student achievement decisions about whether students are able to move successfully on to more rigorous content. That's our primary concern. All of these efforts are about increasing student achievement and making sure that we have accurate measures of their abilities. It's meaningful for that purpose. And all of the accountability that goes with it is really for that same purpose. ... 

Is it going to apply also to the end-of-course exams?

Yes, we do run these analyses on the end-of-course exams. Of course, they're computer-based, so we don't run our erasure analyses on computer-based assessments.

Is this part of the reason we've been getting our scores later?

... It really doesn't add any extra time necessarily. Because while Caveon is doing their analyses, Pearson is doing other analyses and then they merge together. I really believe that we're doing a really good job of getting our results back quickly. ... 

What do you say to the teachers and other people who say all this focus on test cheating is insulting? Why would you think that we cheat?

Again, our focus is on ensuring the integrity of the data. I would think that most teachers would support that emphasis so that all of the results are fair and able to be compared consistently.

Are there any other things that are being added now?

At this time I can't think of anything. But as we continue to learn and hear about new ways of looking at things, we will always consider those. We annually have an assessment debrief meeting with our district coordinators of assessment, and we learn about how the administration went in the spring and any concerns they may have. ... We're always looking to improve. But at this point in time we don't have anything on our list right now.

What about the students who are taking the tests on computers? Most of the schools I have seen don't have enough computers to have everyone take the test at the same time. How do you prevent cheating of the kind, "Hey, I just took the FCAT. Watch out for Number 5"?

Well, we do have different test forms. That's the one good thing. This year we have four different test forms. There's always going to be that element, especially in this day and age when people are posting things on Facebook. But because we have different test forms, and because we're trying to structure an administration window where districts are trying to use the fewest days possible to administer a computer-based test, we're hopeful that's going to be minimized. We're doing the best we can. I think we'll continue to work on strategies to minimize the exposure. But there's always going to be someone who remembers an item and tells their best friend. ...

Do you have any advice for the kids who are getting ready to take the exam as they enter the room, to make sure they do and don't do certain things?

I think they just need to do the best they can. The best strategy for getting ready for the exam is to focus on the standards. Obviously, at this point they're not going to be focusing on the standards, but their teachers have spent the whole year teaching the standards, and our assessments are aligned to those standards. So there shouldn't be any extra effort other than to do their best. One thing that I would like to mention, just a practical piece of advice, would be to make sure they don't have electronic devices on them. That is a very common reason for having a test invalidated, and it's very avoidable.

[Last modified: Sunday, April 8, 2012 8:42am]

    
