State schools fail to keep eye on potential test cheating
Could Washington have the kind of test-cheating scandals that have occurred in Atlanta and other places? If it did, we might not know, because the state takes a passive role, employing none of the post-test analyses that many other states use to detect wrongdoing.
Seattle Times education reporter
Each school year, Washington education officials receive, at most, a handful of allegations that teachers cheated to inflate student scores on state tests. Many years, they receive none at all.
Officials attribute that record to how well they train school-district staff in test security, and how those people, in turn, train their teachers and principals. They also praise the honesty of the teachers and other staff members who proctor the dozens of state-required tests that Washington students take each year.
But test tampering could be a bigger problem than they realize. When it comes to uncovering cheating, Washington state takes a passive role, employing none of the post-test analyses that many other states routinely use to detect wrongdoing.
The state Office of Superintendent of Public Instruction (OSPI) does not pay the state’s testing contractor to look for erasure patterns on student answer sheets that suggest someone changed wrong answers to right ones after the test. That’s what dozens of teachers and principals in Atlanta are charged with doing, and some have confessed.
Nor does the superintendent’s office look for improbably high gains in a school or district’s scores or other suspicious results, such as a class full of students with identical answers to many questions.
The office does examine statewide scores to see if anything looks amiss. It also works to prevent cheating in the first place, which testing experts also stress as a key part of security.
Yet despite the growing number of cheating scandals across the nation — and the increasing importance placed on test scores — the office’s assessment staff don’t take the next step and analyze individual school-district and school scores, too.
Officials say they don’t need to spend time and money on such post-test analyses because they’re confident suspicious activity would be reported to them.
“Based on how closely we monitor and work with the school districts, we’ve not found any cause to ask for the additional funds,” said Christopher Hanczrik, OSPI’s director of assessment operations.
But national testing experts say relying on whistle-blowers and on school districts to police themselves is inadequate, especially as many states, including Washington, start to use test scores in evaluating teachers and principals.
In the past year, two prestigious national organizations have recommended that post-test analysis be a regular part of any state testing program.
Jim Wollack, a test-security expert at the University of Wisconsin, said states that forgo such analysis of school and district scores are blind to any misconduct happening at those levels.
Greg Cizek, a professor of educational measurement and evaluation at the University of North Carolina at Chapel Hill, called such states irresponsible.
“A state’s primary obligation is to ensure that the scores reported on their tests are valid,” he said, “and if you don’t do any kind of post-test analysis, you can’t affirm that the scores are valid.”
Like Washington, Georgia didn't regularly look for suspicious erasure patterns until 2008, when The Atlanta Journal-Constitution raised questions about a number of schools where test scores had risen too much to be believed.
In one school, a large number of students who had failed the state test in the spring earned very high scores on retests in the summer.
Investigations have since spread to hundreds of schools across Georgia, leading to indictments earlier this year of nearly three dozen Atlanta educators who, under pressure to raise test scores, allegedly changed answers for years before they were caught.
Former Superintendent Beverly Hall was indicted, too. Hall, who resigned in 2011 after the scandal broke, had offered cash bonuses to schools that met achievement targets.
Since 2008, cheating scandals have also emerged in Washington, D.C., Pennsylvania, Ohio and a number of other places, prompting calls for stricter test security. Among those calling for it: the national organization that represents school-testing professionals, where Cizek and Wollack are board members, and the organization that represents state education departments.
Most states already do post-test analysis.
In a survey conducted earlier this year by the Government Accountability Office, 37 states reported doing one kind of post-test analysis or another. Thirty-three said they investigate erasure patterns, and 28 reported looking for unusual score gains and losses.
Fewer and fewer states, Cizek said, are in Washington’s camp — relying only on prevention efforts and whistle-blowers.
Cheating may still be rare, he said, involving at most 5 percent of classrooms across the nation, but it’s still a problem.
“In every case we’ve seen when states or districts think there is no cheating going on,” Cizek said, “they are surprised to find out there is much more than they ever believed there was.”
Washington seems to do a good job when it comes to prevention. It follows many of the national recommendations in that arena, such as providing annual training for school-district staff, and requiring anyone who touches a state test to sign a statement saying he or she knows the rules.
OSPI has strict procedures for everything from how to lock up exams before and after they're given, to what teachers can say and do during testing, to what should happen if a student needs to use the bathroom.
The state asks school and district staff to report any irregularities — inadvertent mistakes, as well as suspected tampering. It also has a hotline for anonymous reports.
That helps. The state receives hundreds of reports each year about testing irregularities, and this year an Olympia science teacher turned himself in for using a photocopy of last year's fifth-grade science test to help prepare students for this year's exam. OSPI invalidated the scores of all 165 of his students.
But the state doesn’t look for cases like that on its own. Even when suspected cheating is reported, OSPI largely turns to districts to investigate themselves.
Hanczrik cites cost as another reason why OSPI doesn’t do more, saying state legislators have cut the state budget dramatically, and his office has just three people.
“It’s not being irresponsible,” he said, “it’s making the best choice we can, with the available resources.”
He also suggested that the cost of post-test analysis was not justified, comparing it to an individual paying for an MRI every year just to rule out the possibility of a serious illness.
But experts such as Cizek argue that such reviews are much cheaper than the kind of investigations going on in Atlanta, and that the price tag is small compared with the overall cost of testing.
In Washington, for example, the state pays $30 million a year to the contractor that develops and scores state tests.
An erasure analysis, Hanczrik said, would cost $100,000 more.
The state superintendent’s office may add some post-test analysis in the next few years, Hanczrik said, especially as computer-based testing makes it much cheaper and easier to do.
But he also said those discussions have yet to start, even though more than half the state’s schools already give state exams online.
In defending the lack of post-test analysis, Hanczrik also argued that Washington's tests for students in grades 3 through 8 aren't high stakes, so there is little incentive to cheat.
Cizek and Wollack dismissed that notion, saying that, under the federal No Child Left Behind Act, schools can face sanctions for chronically low scores, which can include removing the principal and staff.
“In every state across the country, the K-12 assessment has high stakes in it for someone,” Cizek said.
What to watch
At a minimum, Cizek and Wollack said, states should do erasure analysis on paper-and-pencil tests, and look for big jumps in scores.
For online tests, Wollack said, states should look for fast response times, an indication that students may have been coached on a particular question or questions.
Cizek said states also should look for changes in performance over time — to see if an item that students struggled with one year suddenly is easy the next.
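The screens Cizek and Wollack describe are, at bottom, simple statistical comparisons: a classroom whose numbers sit far outside the norm gets a closer look. As a purely illustrative sketch (the classroom labels, counts and threshold below are invented, and real erasure analyses are considerably more sophisticated), a wrong-to-right erasure screen might work like this:

```python
# Illustrative sketch only -- not OSPI's or any state's actual method.
# Flags classrooms whose wrong-to-right erasure counts are far above
# the median for the test, a simple robust outlier check.
from statistics import median

def flag_outliers(erasures_by_class, multiplier=3.0):
    """Return IDs of classrooms whose wrong-to-right erasure count
    exceeds `multiplier` times the median count across classrooms."""
    med = median(erasures_by_class.values())
    return [cid for cid, count in erasures_by_class.items()
            if count > multiplier * med]

# Hypothetical data: classroom "F" shows roughly ten times the
# typical number of wrong-to-right erasures.
data = {"A": 3, "B": 4, "C": 2, "D": 5, "E": 3, "F": 40}
print(flag_outliers(data))  # prints ['F']
```

A flag like this is only a starting point for investigation, not proof of cheating; legitimate explanations, such as a class that ran out of time and went back to revise answers, would still need to be ruled out.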
In this state, however, OSPI hasn’t looked into every potential cheating problem that others have pointed out.
Last year, four Washington school districts ended up on a list published by The Atlanta Journal-Constitution, which, after the cheating scandal in its own city, reviewed test results across the nation in search of other suspicious patterns.
What did OSPI staff do?
Nothing, Hanczrik said. His office left it up to those school districts to determine if cheating had occurred. No one examined the data, or made a single phone call.
Linda Shaw: 206-464-2359 or firstname.lastname@example.org. On Twitter @LShawST