For a man whose professional passion in life is statistics, Dr. William Sanders has devoted an enormous amount of his career to education. The last 21 years, in fact. Sanders doesn’t necessarily mind, although he shrugs his shoulders in some wonderment at the whole notion. “This is not something that I planned,” he said. “This is something that fortuitously happened 21 years ago.”

The “something” Sanders mentioned is his off-the-cuff invention of a system to evaluate student achievement gains. Speaking at a North Carolina Education Alliance Headliner luncheon inside North Carolina State University’s McKimmon Center, Sanders related the unintentional development of the value-added student assessment method. Along with Dr. June Rivers, Sanders now heads the Value-Added Assessment and Research Program at the SAS Institute in Cary, N.C.

A better example

While a faculty member at the University of Tennessee at Knoxville, Sanders noticed a news item criticizing the use of achievement data and statistics in student assessment. The story’s conclusions, he decided, were correct, but for the wrong reasons. When Sanders walked into his advanced linear modeling techniques class, he cited the news story’s faulty statistical method and, he said, “pulled the value-added system out of the air” as a counterpoint to the botched report. The method Sanders outlined in that spontaneous classroom example eventually became the cornerstone of Tennessee’s student assessment system.

What schools needed, Sanders knew, was a method for assessing gains in student achievement over time, one that could stand up to the problems that usually frustrate that kind of data collection: incomplete records, the difficulty of following each child’s individual progress, and the lack of a baseline against which to measure change in each child’s performance.

Sanders was allowed to test his proposed system in Knox County, Tenn. He obtained school roll books and began to extract information. With the help of college students and what he describes as “about 200” computers, he logged student test data and correlated it with the teacher who taught each child.

Sanders wrote his report and submitted it to the county in September 1982. Knox County apparently wasn’t expecting much, if anything, from the research. When Sanders told officials, “I’m through,” they asked, “With what?”

Not until 1990 did the Tennessee Value Added Research and Assessment Center, and Sanders’ method, come to life in Tennessee. By then, Gov. Ned McWherter had realized that there had been no improvement in academic achievement in the state since the mid-1980s, despite the appearance of A Nation at Risk and other school reform proposals.

Today, the Tennessee Value Added Research and Assessment Center has the largest collection of longitudinal student data in the country, Sanders said. Tennessee has been able to follow students from second grade through to college. Instead of measuring one group of second-graders in one year against a different group of second-graders another year, the Tennessee system measures the same children’s progress from year to year. “By following growth over time, the child serves as his or her own ‘control.’ This enables the partitioning of school system, school, and teacher effects free of the exogenous factors that influence academic achievement…”

With these individual measurements, researchers get a picture of how much academic growth each child achieves yearly. Sanders’ method also associates the teacher with the child. That tells researchers about the “teacher effect,” the impact of the teacher on academic growth.
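To make the idea concrete, here is a minimal sketch of what “the child as his or her own control” might look like in the simplest possible terms: each student’s prior-year score serves as the baseline, and yearly gains are grouped by teacher. The data, names, and straight averaging are hypothetical illustrations only; the actual TVAAS analysis rests on far more elaborate statistical modeling.

```python
# Toy illustration (not the TVAAS model): each child's prior-year score serves
# as his or her own baseline, and the yearly gains are grouped by teacher.
from collections import defaultdict

# Hypothetical records: (student, teacher, prior_year_score, current_year_score)
records = [
    ("s1", "Ms. A", 510, 532),
    ("s2", "Ms. A", 480, 505),
    ("s3", "Mr. B", 600, 604),
    ("s4", "Mr. B", 450, 455),
]

gains_by_teacher = defaultdict(list)
for student, teacher, prior, current in records:
    gains_by_teacher[teacher].append(current - prior)  # the child is the control

for teacher, gains in gains_by_teacher.items():
    avg_gain = sum(gains) / len(gains)
    print(f"{teacher}: average gain of {avg_gain:.1f} points across {len(gains)} students")
```

Even in this stripped-down form, the gain scores reflect growth rather than absolute level, which is why comparing a teacher’s students to themselves, rather than to a different cohort, is the heart of the approach.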

Tennessee value-added system

The Tennessee Value Added Assessment System includes data from the state’s comprehensive testing program: results from annual tests in math, science, social studies, reading, and language arts for grades three through eight, and end-of-course tests in five high school math subjects. The Sanders model is robust enough to allow substitution among variables that have the same characteristics, which means the results are not necessarily sensitive to a change in the specific tests used by the school systems.

Analysts collect three years of data before assessment occurs. And the state will not use the TVAAS data alone to evaluate a teacher, school, or school system. Promotion, attendance, and dropout rates are also used in the state’s accountability system.

Sanders used a physical growth curve to draw an analogy to his method. If we plot a child’s height during the growing years, we will often find that it is not a smooth curve. It is still possible, Sanders said, to find the trend line and possibly make some predictions about the child’s future height. Likewise, we could plot a “math growth curve” using annual math test data. The model converts raw scores and grades into a statistically comparable format before looking for trends.
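As a rough illustration of the growth-curve idea, the sketch below converts one hypothetical child’s raw yearly math scores to a comparable scale (here, simple z-scores against invented cohort norms) and fits a trend line. The conversion and the modeling in the actual TVAAS are considerably more sophisticated; this only shows the shape of the idea.

```python
# Illustrative sketch only: put one child's raw yearly math scores on a
# comparable scale using hypothetical cohort norms, then fit a trend line.
import numpy as np

grades = np.array([3, 4, 5, 6, 7, 8])
raw_scores = np.array([512, 540, 561, 570, 595, 620])   # hypothetical child
cohort_mean = np.array([500, 530, 555, 575, 598, 618])  # hypothetical norms
cohort_sd = np.array([40, 42, 45, 44, 46, 48])

z = (raw_scores - cohort_mean) / cohort_sd   # statistically comparable scale

slope, intercept = np.polyfit(grades, z, 1)  # linear trend through the points
print(f"growth trend: {slope:+.3f} standard deviations per grade")
print(f"projected relative standing in grade 9: {slope * 9 + intercept:+.2f}")
```

As with the height analogy, the individual points may bounce around, but the fitted trend line is what supports a prediction about where the child is headed.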

As in physical growth, learning can accelerate, decelerate, or remain flat. Observations of one child alone don’t tell us much about a teacher, but a flat line for most children, Sanders said, could be evidence of something going on in that class that year. Racial, socioeconomic, and other factors wash out of the analysis when each child serves as his or her own ‘control.’

According to Sanders, the effectiveness of the individual classroom teacher is as much as 20 times more significant to a student’s annual progress than any of the remaining variables, including class size.

Growth and No Child Left Behind

“The intent of No Child Left Behind is to set an academic floor,” Sanders said. But he argues that states and districts have to go beyond NCLB, particularly when it comes to adequate yearly progress. Adequate yearly progress measures the average performance of different categories of children, and schools that fall short face serious sanctions under the law. Incentives to avoid those sanctions, he argues, may have unintended consequences for some students.

One incentive is to focus attention on the children who are nearest the adequate yearly progress achievement standard, the ones who can “make or break” the school’s rating. Sanders predicts that students far below or well above the standard won’t get the same attention. Academic growth rates will be highest for the almost-proficient children, and lower for low- and high-achieving students. Two out of three groups of children won’t be achieving at their potential.

Using the techniques developed in their program, Rivers and Sanders can plot projected academic growth rates for different students. “Because you can see what the growth trajectory has to be to get kids to make appropriate progress, you can tell whether they are on a path to make it,” Sanders said. Low-performing students will not improve rapidly enough if all they do is meet adequate yearly progress, he said. “The trick is to get the gain rates to where you can create a progression like compounding money. That’s the way you ratchet academic achievement to higher and higher levels.”
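The compounding analogy can be shown with simple arithmetic. The sketch below, using invented numbers and bearing no connection to the actual Rivers-Sanders projection method, contrasts a flat annual point gain with a constant percentage gain rate that compounds year over year, the “compounding money” progression Sanders describes.

```python
# Arithmetic illustration of the "compounding money" analogy, not the actual
# Rivers-Sanders projection method. Hypothetical numbers throughout.
current_score = 400.0   # where a low-performing student is now (hypothetical)
target_score = 600.0    # hypothetical proficiency target
years = 4

# A flat, "just meet the floor" approach adds the same points every year.
flat_gain = (target_score - current_score) / years          # 50 points/year

# A compounding gain rate grows the score multiplicatively, like interest.
compound_rate = (target_score / current_score) ** (1 / years) - 1

score = current_score
for year in range(1, years + 1):
    score *= 1 + compound_rate
    print(f"year {year}: {score:.0f}")

print(f"flat gain needed: {flat_gain:.0f} points/year; "
      f"compounding rate: {compound_rate:.1%}/year")
```

The point of the analogy is that a sustained gain rate, like a sustained interest rate, accelerates progress the longer it is maintained, which is what “ratcheting” achievement upward requires.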

Sanders’ studies have found that one year with a poor teacher has a noticeable effect on a child’s academic growth. Two or more years of poor teachers can devastate achievement. There is little evidence of “compensatory effects” from later help. Sanders found that years of service are strongly related to effectiveness. Effectiveness increases in the early years, and typically plateaus at the 22nd year. After that, there is a lot of variability, he reports. Teachers who leave the profession after one or two years have been the most ineffective teachers, his studies show.

Class size, Sanders said, has an effect only when it approaches the level of private tutoring. Reducing a class by two, or even five, students is insignificant. Instead, he argues for variable class size, determined by the needs of the students. Low-achieving students benefit most from smaller classes; high-achieving students don’t benefit as much, and don’t need them.

The general findings of value-added assessment, as seen in Tennessee, are that the teacher is as much as 20 times more significant than any other factor in student achievement growth. Disadvantaged children make as much progress as other children with the same teacher, and schools in poor and minority areas are as effective as other schools.

Sanders knows his views won’t be politically popular. “I try to let the data speak,” he said, “and I’m telling you what the data say.”

Palasek is a policy analyst with the North Carolina Education Alliance and an assistant editor at Carolina Journal.