US News Best Colleges Rankings 2011: Changes in Methodology Make Them Less Helpful!

In a previous blog post about the Forbes rankings, I explained why I think understanding the methodology of a ranking is the key to determining if and how a particular ranking is helpful in your own college search.  Since US News is the granddaddy of the college rankings (now in their 26th year), I'm always interested in how they change their methodology each year.

Yes, US News changes their methodology each year.  Why?  Well, according to them, it is because they are ever striving to make the rankings better.  I'm a little more cynical and believe they do it so that the rankings actually change a bit from year to year.  It is hard to get people to buy an annual ranking publication if the rankings stay the same from year to year.  Why buy the 2011 edition if it has the same list as the 2007 edition?  (If you want proof that these tweaks in methodology do produce year-to-year variation in the rankings, check out this chart that compares colleges' US News rankings from 1983-2007.)

Regardless of why US News changes its ranking methodology each year, they do.  So do this year's changes make the US News rankings more or less helpful?  I'm sorry to say they make them less helpful, and below I explain why.  Just to make this more fun to read (as well as more interesting to write), I've imagined a point-counterpoint debate between US News and myself about each of the changes in methodology.  The US News quotes come straight from the horse's mouth; they can all be found within US News' own description of the methodology.  I've taken the liberty of following my own rebuttals with explanations so you understand my thinking.


Change #1:  US News has renamed its categories for schools.

US News says, "To make the rankings more understandable and to reduce confusion, for the 2011 Best Colleges we changed many of the ranking category names."

Alison says:  You failed, US News.  Your category system wasn't a particularly good one to begin with, but now it is just a hot mess.

***

Before US News decided to change things, they ranked schools within 4 categories based on the degrees conferred by the institution:

  1. Universities:  confers bachelor's, master's, and doctoral degrees

  2. Universities-Masters:  confers bachelor's and master's degrees

  3. Liberal Arts Colleges:  confers bachelor's degrees, with more than 50% in liberal arts & science majors

  4. Baccalaureate Colleges:  confers bachelor's degrees, with less than 50% in liberal arts & science majors

The problem with these categories isn't one of nomenclature, although US News claims it is.  The problem is that these categories don't mirror the way students and families think about their college choices.  Students and families think in terms of selectivity.  Just stop and ask your average 11th grade student or parent of an 11th grade student: "What kinds of colleges are you considering?"  Do they say, "Well, I'm only considering Ph.D.-conferring institutions"?  No, of course not.  They say, "Well, I think I'm a pretty good candidate, so I'm shooting for the most selective colleges, places like Harvard and Williams."  In the US News system, Harvard and Williams are in different categories and can't be compared to each other, while Harvard and Wright State University are in the same category and can be compared to each other.  Come on.  Not a very helpful category system for real people.  I can say I have never had anyone ask me to compare Harvard and Wright State.  They simply aren't the same kind of institution.

To solve the supposed problem with the nomenclature, US News has renamed the categories but not changed which schools are in them.  Can anyone say lipstick on a pig?  Worse still, the new names are actually misleading.  Universities-Masters become Regional Universities and Baccalaureate Colleges become Regional Colleges, while Liberal Arts Colleges become National Liberal Arts Colleges.  Supposedly the "Regional" moniker reflects that the colleges and universities within these categories "tend to draw heavily from surrounding states."  Really?  Because I'm pretty sure that all the military academies draw from a national pool of students, but the Air Force and Coast Guard academies are labeled as "Regional Colleges" while West Point and Annapolis are considered "National Liberal Arts Colleges."

Change #2:  US News has precisely ranked the top 75% of schools in each category, instead of just the top 50%.

US News says:  "In response to a strong interest from readers in knowing precisely where all schools on their list stand, we've opted to display the rank of the top 75 percent of schools in each category, up from 50 percent. This top ranked group will be called the First Tier. The schools in the bottom 25 percent of each group are listed alphabetically as the Second Tier; which was previously called the 4th Tier."

Alison says:  Be humbler, US News.  You can't possibly believe that you have the ability to distinguish between school #80 and school #81.  Just give us tiers and let us quibble about who is #1, #48, or #325.

***

I happen to have been a fan of the US News tier system of rankings because I think they actually tell you something meaningful. In the tier system, schools were grouped by quartiles:  Tier 1 (top 25%), Tier 2 (26-50%), Tier 3 (51-75%), Tier 4 (bottom 25%).  I personally think that this is about as precise as you can be in a rankings system:  ask someone who knows colleges to separate them into the best, the above average, the below average, and the worst and those groupings won't differ much from person to person or from criteria to criteria.  But ask for precise rankings and you see wide divergences because in order to be that precise, you have to start splitting hairs and deciding which hairs are more important.
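To make the quartile idea concrete, here is a minimal sketch (my own illustration, not anything US News publishes) of how a tier system buckets schools once you've sorted them by whatever composite score you like:

```python
# A minimal sketch of quartile-based tiers (my illustration, not US News code).
# Given schools sorted best-first by some composite score, integer math
# puts the top 25% in Tier 1, the next 25% in Tier 2, and so on.

def assign_tiers(schools_best_first):
    n = len(schools_best_first)
    return {school: 1 + (4 * rank) // n
            for rank, school in enumerate(schools_best_first)}

# Hypothetical example with eight schools:
print(assign_tiers(["A", "B", "C", "D", "E", "F", "G", "H"]))
# {'A': 1, 'B': 1, 'C': 2, 'D': 2, 'E': 3, 'F': 3, 'G': 4, 'H': 4}
```

Notice that the scheme never claims school "A" beats school "B"; it only claims both belong in the top quartile, which is about as much precision as the underlying data supports.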

US News thinks they have the ability to be precise.  They say that "the data are complete enough to numerically rank more schools given our robust methodology. The quality of the data we collect has improved over the years, so that it is now rich enough to rank more schools numerically."  I challenge that.  Much of their data is suspect (see more below about reputational surveys and the new graduation rate measure), and the weights they assign to various data are not all that defensible (should reputation be more important than retention and graduation rates, or than the quality of and contact with faculty?).

Beyond that, their regular changes in methodology ensure that the rankings move from year to year, so how meaningful can a precise ranking be?  Far more meaningful is that a school consistently shows up in a particular tier.  For example, Brown has been as high as #7 among the national universities (1985) and as low as #17 (1992, 2003, 2004), but has always been in the top 25.  Obviously Brown is consistently in the top tier.  Do you really need to know more than that, and could you prove to me that Brown is #7 rather than #11 or #15 or #17?  I don't think you could, and I don't think US News can either.  Besides, when you are doing a college search, a tier is enough to give you some basic guidelines.  Then you move on to understanding the nuances that really distinguish colleges and give each college a personality as distinct as a fingerprint.

Change #3:  Graduation Rate Performance (a comparative measure of predicted vs. actual graduation rates) has been given more weight in the ranking.

US News says:   "Graduation rate performance is more heavily weighted. This measure now accounts for 7.5 percent of the final score (compared to 5 percent previously) for National Universities and National Liberal Arts Colleges. This variable—the difference between a school's actual graduation rate and the one predicted by U.S. News based on the students' test scores and institutional resources—has been well received by many higher education researchers because it's a measure of educational outcomes and also rewards schools for graduating at-risk students, many of whom are receiving federal Pell grants. This means that schools can benefit in the Best Colleges rankings by enrolling and then graduating more of these at-risk students."

Alison says:  Pay attention to your customers' needs, US News.  You don't belong in the middle of educational debates about how to get more at-risk students enrolled in college.  You are providing a service to students and families who are using your rankings in the college search.  They do care about outcomes, so give them the best data available about that, not some cooked-up prediction you make.  Satisfy them, rather than pandering to critics in higher education in an effort to rehabilitate your own reputation within the higher education community.

***

I have always been skeptical of this so-called "graduation rate performance" measure.  It compares US News' calculated predictions to actual college outcomes.  Why do we care about a prediction when we have the outcome?  Predictions are only valuable when you can't know the outcome but still need to make a decision.  Once you know the outcome, you should and do use that information in your decision making.  So if you are concerned about a college's graduation rate (and everyone should be), then you consult its actual graduation rate, not the US News prediction of its graduation rate.
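If you're curious what the measure actually computes, here is a rough sketch.  US News does not publish its prediction model, so the linear formula and coefficients below are purely hypothetical stand-ins, but the structure (actual rate minus predicted rate) is as they describe:

```python
# Rough sketch of a "graduation rate performance" measure.
# US News does not publish its exact model; this predicted rate is a
# hypothetical linear stand-in based on test scores and spending.

def predicted_grad_rate(avg_sat, spending_per_student):
    # Hypothetical coefficients, for illustration only.
    return min(1.0, 0.0004 * avg_sat + 0.000002 * spending_per_student)

def grad_rate_performance(actual_rate, avg_sat, spending_per_student):
    # The measure rewards "overperformance": actual minus predicted.
    return actual_rate - predicted_grad_rate(avg_sat, spending_per_student)

# A school predicted to graduate ~70% that actually graduates 80%
# scores +0.10.  My point: if you care about outcomes, the 80% actual
# rate is the number that matters, not this residual.
print(round(grad_rate_performance(0.80, 1200, 110_000), 2))  # 0.1
```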

Interestingly, US News reveals why it even uses this measure, and it isn't to help their customers.  Instead, it is a "make nice" gesture to the higher education community, who complain that the US News rankings (and others like them) discourage schools from all sorts of behaviors that serve loftier goals in higher education, including enrolling more at-risk students.  US News is a commercial enterprise, and the needs of its customers should come first.

Worse, I think that US News has sacrificed its customers' needs for nothing but a token gesture.  Supposedly, by including this measure, US News compensates for how its other measures penalize schools for enrolling at-risk students.  But it doesn't.  This measure accounts for 7.5% of a school's score; the measures that are negatively affected by enrollment of at-risk students (freshman retention, graduation rates, student selectivity) account for 35%.  7.5% hardly offsets 35% -- you do the math.
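Since I said "you do the math," here it is with hypothetical numbers:

```python
# Hypothetical illustration of why a 7.5% measure can't offset 35% of
# measures moving the other way.  Suppose enrolling more at-risk students
# lowers a school's retention/graduation/selectivity sub-scores by 10
# points (on a 0-100 scale) while raising its graduation rate performance
# sub-score by the same 10 points.

penalty = 0.35 * (-10)   # retention, graduation, selectivity: 35% weight
reward = 0.075 * (+10)   # graduation rate performance: 7.5% weight
print(penalty + reward)  # -2.75: the school still loses ground
```

Even in the best case, where the reward measure moves point-for-point with the penalized measures, the school comes out behind.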

Instead of this "make nice" gesture, I wish US News had beefed up its data on college outcomes.  What about adding the percentage of students who graduate in 4 years (not just 6)?  What about adding the percentage of graduates who are gainfully employed or enrolled in graduate school within 6 months?  All that data is readily available, verifiable, and helpful.

Change #4:  US News has included the results from a survey of public high school counselors in the calculation of a school's reputation AND published it as a separate ranking.

US News says:  "For the first time, the opinions of high school counselors—a font of firsthand information about the schools their graduates attend—are factored into the ranking calculations for National Universities and National Liberal Arts Colleges…  [We surveyed] 1,787 counselors at public high schools from nearly every state plus the District of Columbia that appeared in the 2010 U.S. News Best High Schools rankings…  The counselors' response rate was 21 percent."

Alison says:  Get with it, US News.  It is already easy enough to challenge the validity of your rankings because of the inordinate weight given to reputational surveys.  Now you compound the problem by adding in the results of a survey that had a bad sample set and a pathetic response rate.  Hardly first-rate data.  In fact, it appears that your data amounts to the opinions of roughly 375 people (1,787 counselors surveyed × a 21 percent response rate ≈ 375 responses).  Not very impressive, and certainly not worthy of being published as a separate ranking.

***

Now, US News will get no argument from me that college counselors are a font of wisdom; after all, I am both a college counselor and certainly a font of wisdom.  And if US News is going to persist in using "reputation" as a fundamental criterion for ranking schools (reputation counts for 22.5-25% of a school's ranking), I suppose college counselors are a pretty good group to survey.  But this survey and its data hardly represent the collective wisdom of college counselors.  First, why would US News limit itself to counselors from public schools that it ranked in another ranking?  I have no idea, but it is hardly a valid sample set.  What about private school counselors, many of whom specialize in college counseling, and what about independent college counselors like myself?  We are easy to identify through professional associations, so it really puzzles me why US News couldn't go to the trouble of selecting a valid sample set.  The only theories I can formulate don't make US News look good, so since they are nothing but theories, I won't offer them.

Second, the response rate was pathetic.  If you combine the limited sample set with the low response rate, you discover that 375 public school counselors had inordinate power this year.  Really, I don't care who they are -- the opinions of the smartest 375 people in the world shouldn't account for 7.5% of a school's ranking.  One interesting side note/back story here:  emails flew among counselors within professional associations about whether those who did get the survey were going to boycott it.  Many voiced their belief that rankings are more harmful than helpful and indicated they would boycott.  I suspect the response rate reflects this.

Bottom Line?

US News says:  Better than ever.  (Okay, that's my summary, but I don't think they would argue.)

Alison says:  Worse on all counts.