Mythili Rao
Wednesday, February 28
“The MCAT’s power comes from its use as an indicator of your abilities,” says the introduction to Kaplan’s 2007 MCAT prep book. “Good scores can open doors. Your power comes from preparation and mindset, because the key to MCAT success is knowing what you’re up against.”
Test-taker Daniel Sonshine of Brown University was well-versed in Kaplan’s test techniques for the medical school entrance exam, but on Jan. 27, when he turned to the verbal section of his computerized MCAT after “nervously chugging along through the physical sciences,” he had trouble staying calm.
Per Kaplan’s advice, Sonshine sought out the easiest verbal passage first. Having settled on a passage about robotic fish, Daniel read it and then turned to the accompanying questions. The questions were about warblers.
“I was totally caught off-guard,” Sonshine told Campus Progress. “My mind was swimming.”
Sonshine had stumbled upon a rare and completely befuddling testing error. According to the Association of American Medical Colleges, which administers the Medical College Admission Test, 787 of the some 2,400 examinees who tested that Saturday encountered the same incoherent passage-question pairing. These students were left staring at the incomprehensible task–“a square peg and a round hole,” as one student put it–as precious minutes ticked away.
Sonshine spent seven minutes reeling. Then he “struggled, struggled through,” the rest of the section, trying to put the passage behind him.
Weeks later, Sonshine and some 700 others are still struggling to understand what happened and how to proceed as medical school application deadlines loom, and slots for future testing dates remain booked.
Dr. Robert Jones of the AAMC describes the error as a fluke. “Well, first it was a human error, not a computer error. It resulted from a failure of our review processes. All test forms are reviewed by several people in different organizations at a detailed level,” said Dr. Jones in an email. “For inexplicable reasons, this error was not identified prior to release. We are still investigating how that happened.”
In the meantime, the AAMC has promised either to deliver “comparable” scores, calculated from the unlucky students’ performance on the other sections of the test using “a special comparison table,” or to let students void their scores and have their test fee fully refunded.
If students take their chances on the “special comparison table,” the scores will be reported without special comment or notice. “No disclaimer is required,” said Dr. Jones. “We have no reason to believe that the calculated score they receive based on the other items will not provide a fair estimate of their ability.”
But given the nature of the test, FairTest, a nonprofit watchdog group that monitors the standardized testing industry, is calling the remedies offered to students “inadequate.” FairTest spokesman Bob Schaeffer argues that the synthesized, approximated scores can’t be valid–not only would the calculations be based on fewer questions (putting students who had flawed exams at a disadvantage), they also wouldn’t account for the anxiety and loss of concentration produced by the incoherent questions, not to mention the time lost to the confusion. Schaeffer believes that asking students to retest disregards the “huge time and money students put into test”–as well as the constraints of program deadlines. Some public health programs, for example, have deadlines as early as next month.
Test-takers agree with him. An online forum on the Student Doctor Network devoted to this specific MCAT gaffe displays some 200 posts from disgruntled test-takers who are upset about the error and, like FairTest, see the alternatives offered by the AAMC as inadequate. One student, 33-year-old small-business owner William Hibbitts, had enrolled in a two-year post-baccalaureate program at Mills College to become a medical school applicant. As the MCAT date approached, he took three costly months off work to study for the life-changing test.
When he encountered the rascally robo-tuna and warblers at his testing site in California–hours after East Coasters had faced the same glitch–he racked his brains, and then, stumped, sought out his proctor. The proctor whispered that she’d been warned of an error in the test, but said she’d been instructed to simply ask students to do their best. He returned to the test shaken and aghast.
Now he is livid “to have spent two years preparing for a test that I didn’t get a fair shake at.” Hibbitts considered taking legal action against AAMC officials, but decided against a tortuous legal battle. Still, the episode has left a bitter taste in his mouth–and the conundrum of how to remain a competitive applicant after his botched test-taking episode. He wishes the AAMC would make amends. “Their response was horrendous,” he said.
The AAMC’s stubbornness in this incident stands out. On the SDN’s online forum, user mc4435 posted that “a message popped up on the proctor’s screen that said, ‘If students ask about a passage involving fish and birds, tell them to try their best to answer the questions.’ Once I got to that section of the test I knew what they were talking about, and just figured they would throw those questions out. But I was confused as to why AAMC would ask students to even try answering the questions.”
Standardized test errors aren’t unheard of. In fact, Schaeffer sees the glitch as all too routine. “The error on the first computerized version of the MCAT is unfortunately typical of the problems the test industry faces when they roll out new tests in a computerized format,” he said. He cited goofs in the content of the computerized GRE in 1999 and the GMAT in 2001, and described the notorious “black screen of death” that plagued some test-takers in 1997. “You finish taking the test and ask it to tell you your score and the screen would go black and it would crash.”
“You would think that companies that administer products that play a major role in school admissions would take time to get it right,” said Schaeffer, who is critical of the AAMC for both causing the error as well as for their carelessness in the aftermath of the incident. “They still haven’t given a credible, coherent explanation for how this error could happen,” he continued. “The testing industry has so little accountability. The larger issue from our perspective is, why should these flawed tools be allowed to make lifelong decisions?”
It’s hard to overstate the influence–and, in the face of such a glaring error, the fallibility–of the MCAT. Sonshine says his brush with the AAMC’s bureaucracy was eye-opening. “What’s bubbling to the surface, and what scares me,” he said, “is that this organization has a monopoly on the business of making doctors.”
Mythili Rao is a freelance journalist in New York.