For the sixth consecutive year, Forbes magazine raced to beat both the Princeton Review and U.S. News & World Report to be the first to publish its annual college ranking.
And number six is not much better, more useful, or more relevant than numbers one through five. But it is a little different, largely owing to some tinkering with the evaluation formula and to penalties issued to colleges that misrepresented their data.
As always, it’s embarrassing to see how many college publications rise to the bait and run the list. Even more embarrassing are the press releases from colleges basking in the glory of so much as a mention—even if it’s all the way down at No. 197.
Happily, most of the Forbes top ten colleges ignored the honor and didn’t dignify the list with so much as a web note. For their part, Stanford (1) and Princeton (3) were much more interested in conservation strategies for remote areas of the world and questions surrounding the recent uptick in peanut allergies among children.
Only Pomona College lent credibility to the ranking by acknowledging the “flurry of media coverage” accompanying its placement just behind Stanford in the top ten list.
Yet even with a little behind-the-scenes tinkering, the Forbes list does not quite pass the reliability test, and it can’t get beyond the fact that some of the numbers used to generate it have limited validity. RateMyProfessors.com (15 percent)? Payscale.com (15 percent)?
At least this year Forbes dropped Who’s Who in America from the metrics used to determine post-graduate success, and it made an effort to dampen the impact RateMyProfessors has on the student satisfaction metric.
For the record, RateMyProfessors.com is a compilation of opinions shown to be largely from very happy students OR very UNhappy students—not much in between. And, if anyone would bother to look, RateMyProfessors is becoming increasingly obsolete as a rating tool as colleges create and post their own private rating websites (see Stanford’s site for a good example).
Equally ridiculous as a serious evaluation tool, Payscale.com invites readers to self-report salaries. Not only is there no possible way to judge the accuracy of this information, but it also usually represents a very small and select group of recent graduates.
But in a nod to a few more relevant (although not always 100 percent accurate) measures of both student satisfaction and post-graduate success, Forbes used freshman retention rates, student debt, and graduation rates in the mix of numbers factored into its evaluation algorithm. They also came up with a few new ways to consider whether schools are producing leaders in their professions.
And surprise. The new formula resulted in some movement on the list.
For the first time, the Forbes Top Colleges ranking has two non-Ivies at the top: Stanford University (1) and Pomona College (2). Harvard dropped down to No. 8, while Cornell bumped up from No. 51 to No. 19. And the University of California, Berkeley took over as the highest ranked public institution at No. 22.
Locally, the University of Virginia hopped up seven places to No. 29, while the College of William and Mary dropped from No. 40 to No. 44. Georgetown zoomed up to No. 26 from No. 38, while the U.S. Naval Academy went from No. 44 to No. 28. Washington and Lee University dropped from No. 15 to No. 21. And Johns Hopkins made its top 50 debut at No. 46.
This year Forbes instituted a new penalty for schools falsifying data to the U.S. Department of Education, on which the magazine bases some of its calculations.
In the past two years, four schools have admitted to submitting inaccurate data: Bucknell University, Claremont-McKenna College, Emory University and Iona College. As a result, they have been removed from the Forbes rankings for two years.
But just for the sake of argument, why wasn’t George Washington University similarly punished? Could it be one more totally arbitrary decision on the part of Forbes?