*Image: TrinityPrep and Fox*
This is no coincidence—test prep companies like Barron's likely have economic incentives to design their practice test questions to be more difficult than what one is likely to face in an actual testing situation. But what are these incentives?
Strange as it seems, an experiment with weather forecasts in Kansas City could shed some light.
In April 2007, Kansas City resident J.D. Eggelston's fifth-grade daughter was assigned a school project to keep track of the accuracy of weather forecasts for a week. Intrigued, Eggelston decided to continue his daughter's project—for seven months. From his home, he kept track of forecasts—namely, the chance of precipitation—by the local weather station and compared these forecasts to the results he observed.
*Chart: "The Signal and the Noise"*
As the graph above indicates, the local meteorologist was far off the mark, with a pronounced bias towards forecasting a much higher probability of rain than the observed frequency. When, for example, he reported with certainty that it would rain the next day, he was wrong one out of three times!
And it's not just some rogue meteorologist at work in Kansas City. "Rain bias," as it is known, shows up among well-regarded forecasters too. Even The Weather Channel, a forecasting company reputable enough that it is occasionally mistaken for a government agency, tends to overstate the chance of precipitation. Typically, when there has been a 5% chance of rain the next day, The Weather Channel reports the odds as being four times higher, at 20%.
Eggelston was baffled by his results and contacted representatives from the weather station. He was surprised by what he learnt. "Presentation takes precedence over accuracy," one forecaster told him. The overall message was that meteorologists didn't care much about the calibration of their forecasts, so long as they never missed predicting a rainy day. People tend to curse the weatherman who ruins a day at the beach by failing to foresee rain, but they'll happily accept unexpected sunshine as good fortune.
In the weather forecasting industry, there is a strong economic incentive to avoid a false negative—incorrectly predicting that it will not rain—and to err instead toward a false positive: forecasting a rainy day only for it to turn out sunny.
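This asymmetry can be made concrete with a toy expected-cost calculation. The sketch below is purely illustrative—the 5:1 penalty ratio and the cost values are my assumptions, not figures from the forecasters themselves—but it shows how a forecaster who fears a missed rainy day more than a false alarm ends up calling "rain" far more often than the true odds warrant.

```python
# Toy model of "rain bias": a forecaster issues a categorical call,
# "rain" or "no rain". A missed rainy day (false negative) is assumed
# to cost 5x more reputationally than a false alarm (false positive).
# These penalty values are illustrative assumptions, not real data.

def expected_penalty(forecast_rain: bool, p_rain: float,
                     miss_cost: float = 5.0,
                     false_alarm_cost: float = 1.0) -> float:
    """Expected reputational cost of a categorical forecast."""
    if forecast_rain:
        # A "rain" call is wrong only when the day stays dry.
        return (1 - p_rain) * false_alarm_cost
    # A "no rain" call is wrong only when it rains.
    return p_rain * miss_cost

def best_call(p_rain: float) -> str:
    """The call that minimizes the forecaster's expected penalty."""
    say_rain = expected_penalty(True, p_rain)
    say_dry = expected_penalty(False, p_rain)
    return "rain" if say_rain < say_dry else "no rain"

# With a 5:1 penalty ratio, "rain" becomes the cheaper call whenever
# the true probability exceeds 1/6 (about 17%)—well below 50%.
for p in (0.05, 0.20, 0.40):
    print(f"true chance {p:.0%}: forecast {best_call(p)}")
```

Under this toy loss function, the break-even point is where (1 − p) × 1 = p × 5, i.e. p = 1/6—so any true chance above roughly 17% gets announced as "rain," which is exactly the kind of inflation Eggelston observed.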
With that in mind, let's go back to Barron's.
Generally, when you take a mock test shortly before an exam, you treat the mock score as a benchmark. Ideally, you want your final score to be higher than what you achieved on the practice test, and certainly not lower.
If you scored 740 out of 800 on a practice test and then got a 760 on the actual test, you would come away very satisfied with your effort. Importantly for the test prep company's executives, you might even be inclined to regard your prep book as having challenged you positively and given you extra motivation, leading to a better-than-expected score!
On the other hand, you would be not at all pleased had you scored 780 on the practice test but managed only 760 when it mattered. Although the actual test score is the same in both scenarios, in this one you would be immensely dissatisfied at having fallen short of your target, and you might even—God forbid!—blame that same test prep book for instilling false confidence in you.
Clearly, from a sales perspective, it makes good business sense for those test prep executives to ensure that you rarely, if ever, score better on their practice tests than you will on exam day.
So the next time you are disappointed with a mock test score—or are put off by a forecast for a rainy day—take heart, and have a think about the economic incentives that might be at play.
Good luck with exams—and Happy Mother's Day!