Not every school district with big jumps in test scores is cheating.
That’s the word from the Iowa Department of Education regarding implied irregularities in two Iowa school districts: Ottumwa and Muscatine.
The finger-pointing comes from a major American newspaper, the Atlanta Journal-Constitution (AJ-C), which suggested that 200 school districts across the country, including Ottumwa, have standardized test scores unusual enough to justify a closer look by officials. The paper repeatedly noted, however, that the deviations in year-to-year average scores do not by themselves indicate cheating.
But the Atlanta paper said those changes in score resemble early indicators (see sidebar) in Atlanta that ultimately led to “the biggest cheating scandal in American history.”
The analysis looks at the percentage of scores outside expected norms. Five percent outside expected norms is not too bad, the paper said, but deviations over 10 percent catch the investigator’s eye.
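As a rough illustration only (the AJ-C does not publish its full statistical model in the article), the flagging rule amounts to computing what share of a district’s year-to-year score changes fall outside an expected range, then comparing that share to the 5 and 10 percent thresholds. A minimal sketch, with hypothetical data and a made-up cutoff:

```python
# Hypothetical sketch of the thresholding described above.
# The cutoff value and the sample data are invented for illustration;
# the real AJ-C analysis is more sophisticated.

def percent_anomalous(score_changes, cutoff=2.0):
    """Share (in percent) of year-to-year score changes, expressed in
    standardized units, that fall outside the expected range."""
    flagged = [c for c in score_changes if abs(c) > cutoff]
    return 100.0 * len(flagged) / len(score_changes)

def flag_level(pct):
    """Map a percentage of anomalous scores to the article's thresholds."""
    if pct > 10.0:
        return "warrants a closer look"   # deviations over 10 percent
    elif pct > 5.0:
        return "borderline"               # around 5 percent: not too bad
    return "not unusual"

# Hypothetical standardized changes for one district's grade-level cohorts:
changes = [0.4, -1.1, 2.6, 0.2, -2.3, 0.9, 1.5, -0.3, 2.1, 0.6]
pct = percent_anomalous(changes)   # 3 of 10 fall outside the cutoff -> 30.0
```

In this made-up example, 30 percent of the changes fall outside the cutoff, well above the 10 percent level the paper says draws an investigator’s attention.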
Concerned readers contacted the Courier after seeing the Atlanta Journal-Constitution’s mention of the Ottumwa school district.
Iowa’s DOE was unaware of the AJ-C study until contacted by the Courier.
“Any time there’s questions about the integrity of tests, that’s something we would be concerned about,” said Staci Hupp, spokesperson for the Iowa Department of Education.
Leaders and analysts there looked into how the study was done. While they reviewed the data, the Courier asked Ottumwa Superintendent Davis Eidahl for his thoughts on the AJ-C study.
“I am thrilled that out of the whole United States, we had test scores high enough to alarm a newspaper in Georgia,” Eidahl said. “These scores show we are getting results, not only in Ottumwa, but in all of Iowa.”
He’s confident there is no cheating. As they put new teaching methods into place — methods proven to increase student achievement, including test scores — administrators expected to see jumps in test scores.
“This is just one of the assessment tools we use,” he said, adding that Ottumwa students take other standardized tests which aren’t reported to the state, and that those also show improved scores.
All assessments are used to guide teaching by showing where students are weak or strong. Faking the scores would be counterproductive, he said.
“We take extreme measures to ensure the validity of the results,” Eidahl said. “Every administrator goes through the training for testing ethics and administration. We walk them through all the do’s and don’ts, even identifying shady practices [considered gray areas] to stay away from. We don’t want to cross that line.”
The Iowa Test of Basic Skills is protected, he said.
“With the Iowa test, the publisher keeps a very tight eye on those booklets,” Eidahl said.
But there are other ways to skew results that don’t involve blatantly erasing wrong answers — like “teaching to the test.” Teachers get the answers and tell the kids.
Another way isn’t considered cheating: teaching only subject matter the district believes will be on the test.
Since the test is supposed to reflect what is being taught in the classroom, it can appear an entire district is focusing only on certain learning areas. The drawbacks of teaching such a concentrated core have been hotly debated.
Some districts have gotten around the “teaching to the test” question by allowing kids to take “practice tests” so they’ll have a “better idea” of what’s going to be on the test. Practice tests, however, could conceivably get really, really specific.
“We don’t do practice tests because that takes time out of instruction,” said Eidahl. “We would rather spend time balancing equations than stopping to hand kids practice tests so they can practice taking a standardized test.”
Another point Eidahl made seems to have been missed by the Atlanta researchers: He sees a relation between standardized test scores and report card grades from the classroom.
If Sally has a “C” in math class, the standardized tests, in general, should echo that. It would make sense that she’d get a “C” on the standardized test which, after all, is simply checking her skill in math.
The district does not appear to have done anything wrong, the Iowa DOE reported this week, though analysts are still reviewing the data.
“We’re not aware of any cheating allegations in Ottumwa,” said Hupp.
However, there are “anomalies,” the ups and downs seen by the AJ-C. DOE analysts say the Atlanta newspaper used a “cookie-cutter approach” in every state, failing to take differences in Iowa into account. As a result, the year-to-year scores may not have been viewed in context.
For example: “The methodology they use assumes all schools test at the same time [of year],” said Hupp.
Iowa is one of the few states that allows such year-to-year changes.
“Hypothetically,” Hupp said, “if you are a district and you usually test in the fall (when kids first come back to school) and the next year you switch and take the test in the spring (when kids have had six months of training), the students may do better because they’ve had more time to [learn and] study.”
Ottumwa has switched up its testing dates over the past three years.
In its series on cheating in Georgia and across the nation, the AJ-C said principals and other employees would have “change parties,” where they would make the adjustments to improperly obtained test sheets at an employee’s home, complete with refreshments and plenty of erasers.
Meanwhile, kids who may lack basic skills can be overlooked because, on paper, they passed the test.
The paper suggests programs like “No Child Left Behind” or cash bonuses promised for increased test scores provide a motive for teachers to manipulate answers.
Test scores questioned
As an example of the Atlanta Journal-Constitution’s analysis of “suspect” test scores, these are the percentages of unusual scores for 2008, 2009 and 2010 for four of the 200 listed districts:
                   2008    2009    2010
Atlanta, Ga.      18.73   15.79   25.59
Yankton, S.D.      0.00    5.56   25.00
Muscatine, Iowa    8.33   17.50   12.50
Ottumwa, Iowa     12.24    8.33   10.00