By Albert Samaha
Three weeks ago, employees at the Kaplan Educational Center, the standardized-test prep company, received a surprising e-mail. After more than five years of research, the Educational Testing Service (ETS) was introducing the e-rater, an electronic essay reader and scoring system. Starting February 10, prospective business school students taking the GMAT will have their essays graded by both a human reader and a computer. The tests, previously read by two people, will now get the eye of a second human only if the e-rater arrives at a glaringly different score.
Naturally, this has created a buzz among education activists and academics, many of whom fear that software such as this may reduce creative argumentation to formulaic drivel. Programmed using sample essays scored by humans, the e-rater looks at syntactic variety, discourse, and vocabulary. (GMAT administrators claim that 91 percent of the time the e-rater and the human give the same score.) As ETS moves from paper-and-pencil to computer-based testing, the potential of the latter to discourage future Montaignes from playful rhetoric is only one of a series of complaints opponents of standardized testing have about the company.
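For the technically curious, the general idea behind such a system can be sketched in a few lines. What follows is an illustrative toy, not ETS's actual model: it extracts two crude proxies for the qualities the e-rater reportedly examines (vocabulary diversity and sentence-length variety) and assigns a new essay the human-given score of its nearest neighbor among training samples. Every feature and function name here is a hypothetical stand-in.

```python
# Toy sketch of feature-based essay scoring "trained" on human-scored samples.
# This is NOT the e-rater's algorithm; features and method are illustrative only.
import re
from statistics import mean

def features(essay):
    """Two crude proxies: vocabulary diversity and average sentence length."""
    words = re.findall(r"[a-z']+", essay.lower())
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    type_token = len(set(words)) / max(len(words), 1)       # vocabulary variety
    avg_sent_len = mean(len(s.split()) for s in sentences)  # syntax proxy
    return (type_token, avg_sent_len)

def predict(train, essay):
    """Give a new essay the human score of its closest training essay."""
    fx = features(essay)
    nearest = min(
        train,
        key=lambda pair: sum((a - b) ** 2 for a, b in zip(features(pair[0]), fx)),
    )
    return nearest[1]  # the human-assigned score of that neighbor

# Hypothetical "human-scored" training essays (score scale 1-6).
train = [
    ("The market rewards clarity. Yet ambiguity, handled deftly, persuades "
     "skeptical readers more durably than blunt assertion ever could.", 6),
    ("The test is good. The test is good. The test is good.", 2),
]

query = "The exam is fine. The exam is fine. The exam is fine."
print(predict(train, query))  # repetitive prose lands near the low-scored sample
```

A real system would of course use far richer linguistic features and a trained statistical model rather than a single nearest neighbor, but the principle is the same: the machine's score is only ever an extrapolation from scores humans assigned first.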
Among those who protest the use of the e-rater is Upper West Side assemblyman Ed Sullivan, chair of the higher education committee. "I have no problem with students taking tests on computers, since everyone should leave high school computer-literate," he says. "But human beings should gauge the results. The variation of the English language is so great. You can't have a machine test the subtlety of understatement, or dry wit."
But comments such as these leave Jill Burstein exasperated. A development scientist at ETS who worked on the e-rater, she thinks critics who fear the program will stifle creativity are missing the point: "I'm sure it's very politically correct . . . very boho to bash computers for scoring writing. But people don't buy a book of GMAT essay responses to read for pleasure. Creative writing is not what's being tested. [The e-rater] is a new thing, new technology, and that's always hard to accept."
Burstein may be right; the fuss about the e-rater might wane in a few weeks. But larger problems linger. Standardized testing watchdog group FairTest points out that using a computer program in place of a human to score the essays reflects a more insidious trend: as the testing industry cashes in on a nationwide fascination with standardized tests as measures of scholastic achievement, it may not have the public's interest at heart.
Bob Schaeffer, public education director of FairTest, says that the major motivation to switch to computer-based testing is profit. In fact, the e-rater will cut ETS's per-test cost by $50. But Schaeffer questions whether that will in turn affect the $150 price tag of the GMAT. A spokesperson for ETS says that the money saved will be used to "contain costs on the GMAT," but won't predict whether that will mean a lower registration fee.
ETS, a nonprofit, administers 11 million tests a year. As the largest educational testing company in the world, it took in $417 million last year. The president has a six-figure salary and a $54,000 expense account. But to cut corners, according to Schaeffer, ETS began to computerize its tests in 1992. The GMAT is only available on computer, and the pencil-and-paper GRE is currently being phased out. While the switch has been slow, the revenue generated has been steady. Computerized testing is estimated to be at least a $750 million business, with publishers like Houghton Mifflin and Harcourt General also vying for a cut.
The company most frequently recommended by stock analysts, however, is Sylvan Learning Systems, which, thanks to an exclusive contract with ETS (through 2005), administers the computerized versions of ETS's exams. (ETS owns a minority stake in Sylvan, and also has a for-profit subsidiary, Chauncey Group, that specializes in employment testing.) In 1997, the year the GMAT was first offered electronically, Sylvan's revenue increased from $217.9 million to $298.7 million.
Of course, the test companies themselves are hardly the only ones to profit from the testing biz. Kaplan and The Princeton Review have flourished since the GMAT went digital. Prep classes such as these cost an average of $1,000 and are one reason opponents remain skeptical of standardized testing, since students who are financially privileged have a distinct advantage. "You don't know if the score you see is the result of the kid taking the test cold, or [if] their parents had enough money to pay for the coaching," says Schaeffer.
With its strong market ties, and without any federal agency or congressional committee to monitor the industry, ETS's position seems solid. Despite public outcry over the cultural and gender biases of standardized tests, they continue to elicit strong and far-reaching support from politicians. When explaining this phenomenon, Schaeffer takes no prisoners. "We're going through a period in which the emphasis is on accountability," he says. "The focus is to improve educational quality, which is good, but we have latched onto testing as a way to do that. This is bipartisan stupidity, with George W. Bush and Bill Clinton both pushing it. They have no data to prove their case. If anything, reliance on these tests ends up dumbing down the populace."