So let's take a closer look at the situation, as the official ranking is out and confirms that post.
First off, if we compare this rating to last year's, UW hasn't dropped THAT far. Last year, we were tied for 34th with three other universities. Below that, Georgia Tech was tied for 38th with two others, and Illinois sat at 41st.
They don't even have a ranking of top national universities; they just take four and list them. How comprehensive.
So, here's the question: where does the ranking come from? It's built from some incredibly complicated categories, but it basically breaks down as such:
- Retention: 20 percent
- Faculty resources: 20 percent
- Student selectivity: 15 percent
- Financial resources: 10 percent
- Graduation rate performance: 5 percent
- Alumni giving rate: 5 percent
but the biggest category is the most disputed:
Peer assessment (weighting: 25 percent). The U.S. News ranking formula gives greatest weight to the opinions of those in a position to judge a school's undergraduate academic excellence. The peer assessment survey allows the top academics we consult—presidents, provosts, and deans of admissions—to account for intangibles such as faculty dedication to teaching. Each individual is asked to rate peer schools' academic programs on a scale from 1 (marginal) to 5 (distinguished). Those who don't know enough about a school to evaluate it fairly are asked to mark "don't know." Synovate, an opinion-research firm based near Chicago, collected the data; of the 4,089 people who were sent questionnaires, 58 percent responded.
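To make the weighting concrete, here's a minimal sketch of how a weighted-sum formula like this combines component scores into one number. The weights are the ones quoted above; the school's component scores are made up purely for illustration (U.S. News doesn't publish its raw inputs in this form):

```python
# Weights quoted in the methodology above (they sum to 100 percent).
WEIGHTS = {
    "peer_assessment": 0.25,
    "retention": 0.20,
    "faculty_resources": 0.20,
    "student_selectivity": 0.15,
    "financial_resources": 0.10,
    "graduation_rate_performance": 0.05,
    "alumni_giving_rate": 0.05,
}

def overall_score(component_scores):
    """Combine 0-100 component scores into a single weighted score."""
    return sum(WEIGHTS[name] * score for name, score in component_scores.items())

# Hypothetical school: strong peer reputation, middling everything else.
example = {
    "peer_assessment": 90,
    "retention": 85,
    "faculty_resources": 70,
    "student_selectivity": 80,
    "financial_resources": 60,
    "graduation_rate_performance": 75,
    "alumni_giving_rate": 40,
}
print(round(overall_score(example), 2))  # → 77.25
```

Notice that bumping the peer-assessment score by 10 points moves the total by 2.5 points, while the same bump in alumni giving moves it by only 0.5; that asymmetry is exactly why the survey is the most disputed piece.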
So an entire quarter of the score is dependent on academics' and administrators' scorecards for whatever can't be quantified? Hmm. If that's so important, I'd better see MIT take a hit in the rankings (it did: it dropped three places). In any case, quite a few people find this rating incredibly biased, such as the Education Conservancy, which sent a letter to college presidents in 2007:
"Among other reasons, we believe [...] rankings: imply a false precision and authority that is not warranted by the data they use; obscure important differences in educational mission in aligning institutions on a single scale; say nothing or very little about whether students are actually learning at particular colleges or universities; encourage wasteful spending and gamesmanship in institutions' pursuing improved rankings; overlook the importance of a student in making education happen and overweight the importance of a university's prestige in that process; and degrade for students the educational value of the college search process. We ask you to make the following two commitments: 1. Refuse to fill out the U.S. News and World Report reputational survey. 2. Refuse to use the rankings in any promotional efforts on behalf of your college or university, and more generally, refuse to refer to the rankings as an indication of the quality of your college or university."
A number of universities eventually signed the letter, but I can't imagine how you'd measure the impact (or whether there even IS an impact). In any case, at least 25 percent of these ratings is pretty subjective.
I'd like to give that full breakdown, but I don't really feel like paying 15 bucks for the details. As soon as their book comes out, I'll head to the Barnes & Noble and jot down the figures, but for now, take everything in these ratings with a grain of salt.