A New Wrinkle in the Law School Rankings Game

Everyone who operates in the graduate school admissions space knows that rankings are important but should be taken with a grain of salt. Any ranking of programs presented in list form purports to be an objective analysis, but it is always filled with subjective factors, opinion, and incomplete data.

The most famous of these subjective factors, of course, is the "peer assessment" score, which plays a huge role in the formula and stems entirely from opinion (often uninformed opinion at that). As for the hard numbers (average test scores and GPAs, acceptance rates, job placement, and so on), we assume a bit of gaming (using a huge waitlist to improve yield rate is a favorite), but for the most part we trust that the stats can be believed.

Well, as the old expression goes, there are “lies, damned lies, and statistics.” And in the case of the U.S. News Law School Rankings, there are some very interesting statistics indeed.

Yesterday, on his popular TaxProf Blog, Paul Caron wrote a spicy piece wondering aloud whether 16 law schools committed some form of malpractice last year by reporting employment numbers that were lower than the "plug-in" stat U.S. News uses when a school reports no numbers at all. "Malpractice" is a strong word to attach to the actions of these schools, but Caron's point is that these programs harmed their various constituents (students, faculty, alumni, and so on) by reporting numbers that unnecessarily weakened their overall ranking.

Why was this reporting unnecessary? The simple answer is that U.S. News has a default number it uses when schools do not report their "employed at graduation" figures in time for the new set of rankings. Last year, according to Caron, 74 schools did not supply this information, and each of them got the default stat instead, created by taking the "employed after nine months" number and subtracting 30 percentage points. The prevailing opinion is that all 74 schools had actual at-graduation placement rates lower than that x-30% stat, creating an incentive to accept the default number rather than report the real one.
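To make the arithmetic concrete, here is a minimal sketch of how the default works and why withholding can pay off. All the figures are hypothetical, invented purely for illustration; they are not any school's actual data.

```python
# Hypothetical illustration of the U.S. News "plug-in" default.
# All figures below are invented for the example, not real school data.

def plug_in_stat(nine_month_rate: float) -> float:
    """Default 'employed at graduation' stat: the nine-month rate minus 30 points."""
    return nine_month_rate - 30.0

nine_month = 90.0                   # school's reported nine-month employment rate
default = plug_in_stat(nine_month)  # 60.0 if the school stays silent

actual_at_graduation = 52.0         # the school's true at-graduation rate

# If the true rate is below the default, reporting it only hurts the ranking.
if actual_at_graduation < default:
    print(f"Withholding pays: default {default:.0f}% beats actual {actual_at_graduation:.0f}%")
else:
    print(f"Reporting is safe: actual {actual_at_graduation:.0f}% >= default {default:.0f}%")
```

On these made-up numbers, a silent school gets credited with 60% while an honest one reports 52%, which is exactly the incentive Caron describes.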

Caron goes on to discuss the 16 schools that did report numbers below x-30%. The accusation of malpractice is a little strong and probably unique to this setting (it is hard to imagine a blog focused on MBA rankings going there), but the premise is logical: these schools gave more information than they had to, and would have served their constituents better by falling back on the default stat.

But what does that do to the intended audience (prospective students, employers, and so on)? Picking and choosing between actual stats and default formulas does nothing but obscure the true state of a program. It can't possibly be a good thing when schools are strategically withholding data, or when truth-telling programs are ridiculed for sharing information with the public. Can it?

I don't blame U.S. News for its formula (after all, it can't very well exclude that column of data entirely, or, worse, exclude every school that misses a reporting deadline), but the episode is a reminder to keep the machinery in mind when we look at the tidy list of schools ranked in numerical order. We have long encouraged our clients to take the rankings with a grain of salt, but stories like this make you want to recommend taking them with a whole barrel of the stuff.

If you're getting ready to apply to law school, our law school admissions consultants can help you get in. Call us at (800) 925-7737 to speak with an expert about your candidacy today. And, as always, remember to subscribe to this blog and follow us on Twitter.
