New SAT: Just So-So at Predicting College Success

Ever since the new SAT writing test, featuring a proctored off-the-cuff writing sample, was launched in 2005, it has found few fans. Students weren't crazy about having to write a high-stakes essay under time pressure on a randomly assigned topic, or about the 45 minutes the new test added to an already three-hour college-entrance exam. Meanwhile, college admission offices were reluctant to put much weight on a new test of unknown value that hadn't been formally validated.

This week, the nonprofit College Board, which publishes the SAT, gave its critics one less reason to beat up on the writing test. According to its 2008 national validity study, which analyzed the grades of approximately 150,000 students who completed their first year of college at 110 four-year institutions during the spring of 2007, the SAT writing test proved to be just slightly more predictive of college freshman grades than the math and critical reading components of the college entrance test. (For those who appreciate adjusted correlations, the new writing test earned a correlation coefficient of 0.51, compared with 0.47 for the math test and 0.48 for the critical reading test. A correlation of 1.0 is the highest possible value, and anything over 0.5 is generally considered a large correlation.)
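For readers curious what such a coefficient actually measures, the short sketch below computes a plain Pearson correlation on hypothetical, made-up score pairs. It illustrates the statistic itself only; it is not the College Board's adjusted methodology, and the numbers are invented for the example.

    # Illustrative only: what a correlation coefficient measures, using
    # hypothetical data -- not the College Board's methodology or results.
    from statistics import correlation  # requires Python 3.10+

    # Made-up SAT writing scores and freshman GPAs for five students.
    writing_scores = [520, 600, 640, 700, 760]
    freshman_gpas = [2.6, 2.9, 3.1, 3.4, 3.5]

    r = correlation(writing_scores, freshman_gpas)
    print(f"Pearson r: {r:.2f}")  # 1.0 is perfect; above 0.5 reads as large

A value near 1.0 means students who scored higher on the test also tended to earn higher grades; a value near 0 means the two move independently.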

The new study also validated the recently overhauled SAT, of which the writing test was but one component. The new SAT, introduced along with the writing test in March 2005, put increased emphasis on higher-level math, critical reading skills and grammar. Overall, the changes made the SAT less of an aptitude test and more of an achievement test. The validity study concluded that the new three-part test was "not substantially" different, in terms of its reliability, from the old two-part, three-hour SAT, which yielded only math and verbal scores.

While College Board officials were quick to describe the results as proving that the SAT is an "excellent predictor of how students will perform in their first year of college," they also conceded that the study revealed a student's high-school grade-point average to be slightly more predictive of college freshman-year grades than the SAT. However, the results also confirmed that the SAT is a better predictor than high-school GPA of freshman-year success for most minority students. Considering both sets of data will allow college admission offices to predict freshman grades significantly better than either GPA or SAT scores alone, said College Board President Gaston Caperton. The College Board conducted the study with its in-house research staff.

The decision to overhaul the SAT was made in 2001 after the president of the University of California system threatened to stop requiring applicants to submit standardized test results, out of concern that the tests were having a negative effect on high-school curricula. Specifically, he worried that too many high-school students were spending their time drilling the obscure analogies that made up a significant part of the old SAT verbal test rather than writing or critically discussing literature.

Fearful of losing such a huge market or inspiring a copycat trend, the College Board worked with UC officials to overhaul the SAT. The revamped exam introduced a writing section (multiple-choice questions on grammar and usage, plus the new student-produced essay, scored by at least two graders). The verbal section was renamed the "critical reading" test, its analogy questions were eliminated, and the emphasis shifted to comprehension of long and short readings from a broad range of disciplines. The new math section included more advanced problems. A perfect score rose from 1600 to 2400, with 800 points possible in each of the three sections.

The College Board soon realized, however, that the scores from the revamped SAT, particularly those from the writing test, were getting a less than enthusiastic reception. The disclosure that SAT graders were instructed to overlook factual errors and spelling mistakes on the writing test didn't increase its popularity with college administrators. (Good organization, a clear focus and evidence of "outstanding critical thinking" are among the qualities rewarded.) Neither did the news that a writing professor at the Massachusetts Institute of Technology had conducted an experiment in which he successfully taught a handful of high-school students to game the writing test by writing long and loading their essays with big words.

Bruce Poch, admissions director at Pomona College, said colleges were also concerned that the writing test might prove to be disproportionately damaging to students stuck in failing schools where they got little individual attention. "Just as we were pushing to knock down barriers and improve access to students across the economic and ethnic spectrums, [our fear was] that another barrier could appear," he said. The New York Times reported in November 2006 that even the University of California system, which had inspired the change, was not yet downloading the SAT essays or using the writing scores in its admission deliberations.

Meanwhile, the SAT's biggest competitor, the publisher of the ACT exam, chose to make its writing test optional and has continued to see its share of the college-admissions market grow.

Caperton expressed optimism this week that the validation results would improve acceptance of the SAT writing test. "Many colleges said they were waiting for validity studies before deciding how to use the writing scores," Caperton said. "We expect in the future that all colleges will use and require the writing scores" because "they will help them to choose the students likely to succeed at their institution."

Poch said he expects that, as time goes on, more schools will incorporate the score into their deliberations. "I do think more [colleges] are certainly recording the score than may have been the case initially," he said.

But Barmak Nassirian, associate executive director of the American Association of Collegiate Registrars and Admission Officers, said few colleges are likely to view the validation of the writing test "as a transformative gain." The main value of the test, he said, is that colleges "get to see the actual writing" of students in a proctored setting and can use it as a point of comparison with the often heavily edited essays submitted with the application. But they will also recognize that "some [students] write well under deadline and some don't, and that may be all this test shows. Not all colleges will agree that this conveys something profoundly meaningful about an applicant's accomplishments and abilities." Many admission offices will prefer to trust their own judgment concerning the quality of student writing, he said. On that, Poch agrees. "We look at everything but obsess about little," he said.
