Thursday, March 28, 2013

Law School Rankings: By Full-Time Employment and Debt/Earning Ratio

The following is an excerpt from John's legal profession paper, It's the End of the Law School as We Know It: An Examination Upstream, Inside, and Downstream of the Legal Education Market.
John examines why the legal education market developed a bubble (lax oversight by captured regulators, law schools falsely marketing themselves as $160,000-job factories, and an otherwise weak job market), and also delves into the way law school rankings focus on the wrong aspects of a trade-school industry. John created a weighting system for publicly available information on the Class of 2012's full-time employment statistics, as well as average cost-to-earnings ratios. The results might surprise you.
In addition to these two regulatory bodies, it would be naïve not to focus on the market for law school information and rankings. The US News and World Report rankings[1] exemplify the power that publications hold over law school policy, a power that would be difficult to overstate given the great import law school applicants attach to the rankings.
Multiple deans have resigned after a drop in rank. Schools have altered their admissions formula to maximize their ranking. … Schools have shifted scholarships away from financially needy students owing to the ranking. … The fact that reputation among academics is the most heavily weighted factor in the ranking—25 percent of the score—turbocharged the market for lateral hires, boosting professor pay at the high end. The Government Accounting Office issued a report to Congress concluding that competition among law schools over the ranking is a major contributor to the increase in tuition.[2]
While the US News rankings are somewhat right to include reputation in the formula (every current justice of the United States Supreme Court, for example, attended Harvard or Yale), these distinctions have little to do with the actual job prospects facing graduates. Additionally, and concomitant with reputation-related metrics, the rankings place great emphasis on the selectivity and competitiveness of a law school's entering class, prompting law schools to blatantly fabricate statistically insignificant differences in LSAT scores: for several years Villanova reported a median score of 162 when it was 159, and Illinois reported a median of 168 when it was actually 163.[3] The non-profit that publishes the LSAT, the Law School Admission Council (LSAC), has publicly decried these minute differences in score as irrelevant, and reports its scores in six-point bands.[4]
Remedying these rankings is not as difficult as might first be imagined: a ranking focused on employment statistics and student debt-to-earnings ratios can easily be calculated from publicly available information. The US News rankings, however, collect their own information, and muddy the employment statistics, or allow law schools to massage them, in the following ways:
Law school graduates who were employed in a position outside of the legal field, like a grocery clerk, would be identified as “employed” in “business and industry.” … [S]chools left out any graduates who were “not seeking employment” or were pursuing further education … and because US News automatically treated 25 percent of graduates whose status was unknown as “employed,” law schools made less of an effort to get answers from graduates they suspected were unemployed … Finally, law schools offered unemployed graduates temporary jobs—as research assistants or interns at ten dollars an hour—which expired after the period covered by the survey, thus counting them as “employed” when it mattered.[5] 
These obfuscations of the data are not only ethically questionable; they have resulted in lawsuits against dozens of law schools alleging fraud and deceptive business practices for misinforming prospective students about job placement rates.
With publicly available information from the ABA[6] and the Internet Legal Research Group,[7] I constructed a ranking system that weighs post-graduation employment statistics and salary against student debt. Unlike the US News rankings (indeed, unlike any other publicly available ranking system), this proposed system counts part-time, short-term, school-created, and unreported employment statistics against the school, and compares that metric to the overall number of graduates and their debt-to-earnings ratios. The result is a ranking of how likely a school is to improve its students' earning potential and allow them to pay off their debt, while nullifying the known practices law schools use to distort their data.[8] I may modify the rankings slightly to include financial aid data, to better approximate student debt levels, and I'll delve more into the methodology in a later post.
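To make the weighting concrete, here is a rough sketch in code. The field names and the exact way the two metrics are combined are hypothetical stand-ins for illustration, not the paper's actual formula; the key idea from the text is that part-time, short-term, school-funded, and unreported outcomes count against a school, and that the result is compared to a debt-to-earnings ratio.

```python
# A minimal sketch of the proposed ranking, assuming hypothetical field
# names for the publicly available ABA/ILRG data. The real methodology
# (and its weights) will be detailed in a later post.

def rank_schools(schools):
    """Score each school by penalized employment vs. debt-to-earnings.

    `schools` is a list of dicts with these assumed keys:
      name                 -- school name
      graduates            -- size of the graduating class
      ft_longterm_jobs     -- full-time, long-term positions
      penalized_outcomes   -- part-time, short-term, school-created,
                              or unreported outcomes (counted against
                              the school, per the proposed method)
      avg_debt             -- average student debt at graduation
      median_salary        -- median reported starting salary
    Returns (name, score) pairs, best score first.
    """
    ranked = []
    for s in schools:
        # Penalized employment rate: dubious outcomes subtract from
        # the full-time total before dividing by class size.
        employment = (s["ft_longterm_jobs"] - s["penalized_outcomes"]) / s["graduates"]
        # Debt-to-earnings ratio: lower is better, so divide by it.
        debt_ratio = s["avg_debt"] / s["median_salary"]
        ranked.append((s["name"], employment / debt_ratio))
    return sorted(ranked, key=lambda pair: pair[1], reverse=True)
```

Under this sketch, a school placing 130 of 200 graduates in unpenalized full-time jobs, with average debt at twice the median salary, would score 0.325; a school with more school-funded or unreported outcomes and a heavier debt load scores lower and drops in the ranking.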




[1] http://www.usnews.com/education/best-graduate-schools/top-law-schools/articles/2013/03/11/methodology-best-law-schools-rankings
[2] http://press.uchicago.edu/books/excerpt/2012/tamanaha_failing_law_schools.html
[3] http://abovethelaw.com/2011/09/another-law-school-caught-in-a-lie/
[4] http://www.lsac.org/jd/lsat/lsat-score.asp
[5] http://press.uchicago.edu/books/excerpt/2012/tamanaha_failing_law_schools.html
[6] http://employmentsummary.abaquestionnaire.org/
[7] http://www.ilrg.com/rankings/law/
[8] http://www.nytimes.com/2013/03/08/education/law-schools-look-to-medical-education-model.html