Good Accountants are Hard to Find

by Walter R. Tschinkel

Professor of Biological Science
Florida State University
Tallahassee, FL 32306-4370

phone: 644-4489

Recently, Governor Bush and Commissioner of Education Tom Gallagher enthusiastically reported that the Administration's A+ Plan was a terrific success. Failing schools were way down, and "A" schools up. It proved the wisdom of the Administration's A+ Plan, they said.

An amazing success story!

Or is it? Two facts raise doubts. First, the last two years are only the continuation of a five-year trend of improvement. Second, the Board of Education changed the rules for calculating grades, leaving transfer students ("mobility") out of the calculation for the first time. There were other differences as well.

Does this matter?

Yes, it matters. Comparing grades for the two years without emphasizing the change in methods is bad science, and may either hide or exaggerate differences. Imagine that you hired an accountant, Adam Collums, to track your total wealth. For 1998, he includes all your assets except your savings accounts; for 1999, he includes the savings but not your mutual funds; in 2000, he includes everything except capital gains. Would you be confident that you know how your assets are doing? Of course not. In both accounting and education, the basis must be the same every year for the comparison to be meaningful. Any changes to the basis must be clearly stated, and their effect on the outcome made public. This has most definitely not been done.

Including or excluding transfer students affects the school average. Briefly, students from poor families are more likely to transfer and more likely to do poorly on standardized tests (see my two previous Democrat columns on poverty and school performance). Therefore, schools with high student poverty also have high mobility rates and low test scores. For Leon County, every 10% increase in students on lunch support (i.e., from poor families) is associated with a 3.2% increase in the transfer rate (Fig. 1). About one-tenth of the faces in a class in a low-poverty school are replaced by new ones during the school year. In a high-poverty school, about half are replaced.

Fig. 1. As the proportion of students from poor families increases in a school, the proportion that transfer in and out increases.

Next, poor students make up a larger proportion of transfers than they do of the school as a whole. In a representative Leon County middle school (the school and the data are real), about 28% of the students are poor, but 40% of those who transferred were poor (with higher poverty, this proportion will increase). In 1999, more-affluent students averaged in the 66th percentile in reading if they did not transfer and the 59th if they did. Poor students averaged in the 38th percentile (!) if they did not transfer and the 28th (!!) if they did. The effect of poverty is thus 28 to 31 percentile points, the effect of transfer 7 to 10 points, and of both combined, 38 points.

Clearly, leaving these lowest-performing students out of the calculation will boost a school's average, but could it change its grade? Grades are handed out largely on the basis of the percent of students exceeding certain criterion scores (referred to as level 2 and level 3) on the reading and math FCAT for grades 4, 8 and 10. In our representative middle school in 1999, eliminating transfer students from the calculation increased the percent at level 2 and above by 4 and 5 percentage points. About 14% of Leon County schools were within 5% of the criterion for a higher grade. So if we recalculated 1999 grades using the 2000 formula, about one school in seven would probably qualify for a higher grade (other factors enter in too). Higher mobility and poverty rates might well result in larger effects.

Nothing has changed here, except the way we calculated the averages (and this is an election year). Unless we use the same basis for both years, we cannot attribute changes in scores and grades to improved school performance. If we change the basis, we are obligated to be honest and up-front about the changes.

My second point: although the Bush administration claims credit for raising school performance, the trend began well before the A+ Plan (1998-99). Standardized tests were first used to estimate school performance in 1992, when only 48% and 55% of Leon County students performed at level 2 or higher in reading and math, respectively. Beginning in 1995-96, scores increased an average of 4 to 5 percentage points per year (Fig. 2). Not surprisingly, the rise was greater in the high-poverty, low-performing schools. When schools focus narrowly on the task, poor children improve in reading and math. Overall, scores are currently over 80% and still rising. This improvement of roughly 30 percentage points is an achievement for our schools, but it is associated with the advent of accountability, not the A+ Plan.

Fig. 2. The proportion of Leon elementary students performing at minimum reading and math level or above has increased steadily since 1995-96.

The current system of evaluating schools serves a public policy aimed at improving public education, basing reward and sanction on grades. But good and fair public policy must be based on real understanding and accurate reporting. If the public is to have confidence in the outcome of these policies, the current confusion and inconsistency in data collection, analysis and reporting must be replaced with a system of greater clarity. Perhaps the A+ Plan really deserves a large measure of credit, as its authors claim, but how are we to know when the analysis is obviously and deeply flawed?