What Happens to Bar “Never-Passers”?

September 2nd, 2009

Jane Yakowitz (UCLA, Empirical Research Group) has posted on SSRN an interesting working paper, The Marooned Law School Graduates: An Empirical Investigation of Law School Graduates that Fail the Bar Exam, that draws on various sources to establish what happens to “never-passers,” law graduates who never pass the Bar exam.

Gary Rosin

Components of US News 2010 Rankings (III)

August 31st, 2009

Tom Bell has posted the third installment of How Top-Ranked Law Schools Got that Way.  In this post, he focuses on schools ranked 41-51, 94-100, and the bottom 8 (according to his model).  What I would have liked to see, instead, are schools on either side of the major breaks: 20th, 50th, 100th, Tier 3/Tier 4. If peer reputation is the biggest influence on rankings, what moved schools above these lines?

Gary Rosin

LSATs by Majors, 2007-2008

August 31st, 2009

In a working paper on SSRN, LSAT Scores of Economics Majors: The 2008-2009 Class Update, Michael Nieswiadomy (North Texas, Economics) looks at average 2007-2008 LSAT scores by undergraduate major.  The table below is from Table 2 of the paper (p. 6), which lists a total of 29 majors with at least 450 LSAT takers in 2007-2008:

Top 10 Majors by 2007-2008 LSAT Average


Rank  Major                      Average Score  No. of Students
  1   Physics/Math                   160.0             577
  2   Economics                      157.4           3,047
  3   Philosophy/Theology            157.4           2,581
  4   International Relations        156.5           1,520
  5   Engineering                    156.2           2,197
  6   Government/Service             156.1             578
  7   Chemistry                      156.1             632
  8   History                        155.9           4,169
  9   Interdisciplinary Studies      155.5             652
 10   Foreign Languages              155.3           1,084

Note:  Majors with at least 450 takers.

The paper does not tell us whether (or when) the differences in means are statistically significant.  In terms of practical significance, is the average engineering major (156.2) that much stronger than the average Government/Service or Chemistry major (156.1)?

In terms of getting into law school, consider the distribution of the 75th and 25th LSAT percentiles of the Fall 2008 entering law-school classes (as reported in the 2010 Official Guide).  An average Physics/Math major, with a 160.0, would have fallen in the top

  • quarter (75th percentile) at 60% of law schools, and
  • three quarters (25th percentile) at 85% of law schools.

Certainly, the average Physics/Math student has a good chance of getting into law school, but not necessarily one of the top 50 law schools. 

It also would be interesting to know how widely the scores vary within each major (the standard deviation).
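The significance question could be settled with a simple two-sample test if the paper reported standard deviations.  As a rough sketch, using the means and sample sizes from Table 2, but with a hypothetical within-major standard deviation of 10 points (roughly the spread of the full LSAT scale; the paper itself reports only means and counts):

```python
import math

def z_test(mean1, n1, mean2, n2, sd1=10.0, sd2=10.0):
    """z statistic for the difference of two independent means.

    Means and sample sizes come from Table 2 of the paper; the
    standard deviations are HYPOTHETICAL (assumed to be 10), since
    the paper does not report them.
    """
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)  # standard error of the difference
    return (mean1 - mean2) / se

# Engineering (156.2, n = 2,197) vs. Chemistry (156.1, n = 632)
z = z_test(156.2, 2197, 156.1, 632)
print(round(z, 2))  # → 0.22, well below 1.96, so not significant at the 5% level
```

By contrast, the same test on Physics/Math (160.0, n = 577) versus Economics (157.4, n = 3,047) gives z of roughly 5.7, so that gap would be significant even under fairly different assumed standard deviations.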

Gary Rosin

 

Journal Reputation and Moving Up

August 28th, 2009

In Signaling Value of Law Reviews, I noted an article by Al Brophy (North Carolina) cautioning that scholarship should be judged on its own merits.  Paul Caron notes an empirical study of “the theory of cumulative advantage in science (Matthew Effect),” that controls for article quality.  In The Impact Factor’s Matthew Effect:  A Natural Experiment in Bibliometrics, Vincent Larivière & Yves Gingras conclude

The intrinsic value of a paper is thus not the only reason a given paper gets cited or not; there is a specific Matthew effect attached to journals and this gives to papers published there an added value over and above their intrinsic quality.

So, it’s not just a matter of the quality of the paper, but also of its placement.  It follows that an author’s academic reputation is also enhanced by placement.  Thus, the urge to “trade up” in placement of articles.  Presumably, a law school’s peer reputation follows (with a lag?) that of its faculty.  Earlier, I discussed Jeff Lipshaw’s (Suffolk) thoughts on the penchant of ambitious young professors to “move up the food chain to a law school with a higher ranking.”

Given the strong influence of peer reputation in the US News law-school rankings, should lower-tier law schools try to move up in the rankings by offering pay-for-placement bonuses to young professors who might be just moving through?  Or does their ability to attract such professors (or the professors’ earlier association with the school) also enhance the school’s peer reputation?

Gary Rosin

Teaching and Law School

August 24th, 2009

Over on TaxProf, Paul Caron notes in What’s Wrong with Law School a comment by Dean Erwin Chemerinsky (UC-Irvine) that his professors at Harvard weren’t interested in their students.  Caron notes that the faculty at his son’s college voted down a proposal to reduce the teaching load, and wonders whether a law-school faculty has ever done that.

With the talk about alternative outcomes and assessment measures, it will be interesting to watch what happens.  Will those be limited to clinics, or will all of the law professoriate have to start worrying about whether students are actually learning?

Gary Rosin

The Signaling Value of Law Reviews

August 24th, 2009

I haven’t looked at this yet, but this new article looks interesting:  Alfred L. Brophy, The Signaling Value of Law Reviews:  An Exploration of Citations and Prestige, 36 Fla. St. U. L. Rev. 229-243 (2009) (SSRN).  His conclusion is that

the results here suggest that we should be wary of judgments about quality based on place of publication. We should also be wary of judgments about quality of scholarship based on number of citations and we should, therefore, continue to evaluate scholarship through close reads of it.

Gary Rosin

Components of US News Law-School Rankings (II)

August 24th, 2009

Tom Bell (Chapman) has posted another installment of his series on the components of the US News law-school rankings: How Top-Ranked Law Schools Got That Way, Pt. 2.  In this installment, he stacks the components of the top 22 schools in a bar graph.  Still to come: schools 41-51, 94-100, and the bottom 8.

Gary Rosin

Ambition and Rankings?

August 21st, 2009

Jeff Lipshaw has an interesting comment on the rankings game, Ambition and Rankings:  “We Have Met the Enemy and He Is Us”.  He begins by noting a WSJ article by Eric Felten, who is of the view that “the rankings are really about getting ahead.”  He then notes a discussion with a colleague about the Big Law School game: moving up to more prestigious law schools.  He continues

Yes, I think the rankings do have something to do with our subjective views of getting ahead, and I do think there’s something about the legal profession that makes OUR rankings so powerful.  I used the phrase “progressing up the food chain” with my colleague, and in what industries or professions is the food chain as quantitative as the legal profession?  * * *

* * *

… there’s a lot of self-selection in the process of becoming a lawyer, and even more in becoming a big law firm lawyer or a law professor.  I suspect the first element of that self-selection is a particular orientation to progressing up the food chain….  There ain’t that much to distinguish us…. There are only dozens and not thousands of law schools.  * * * In other words, it’s easy to see a well-defined food chain in the relatively small, homogeneous, and closed legal community.  

Most of the blogging about law-school rankings focuses on the top law schools, and sometimes as far down as the top 100.  Perhaps that’s because they are the only schools individually ranked, but I don’t think so.  I’m not sure that professors at top law schools really care about what happens on the other side of the Great Divide in the legal academy (Tiers 3 and 4).  If nothing else, the concerns of the lower-ranked schools are not the concerns of the elite. 

For example, during the debate about ABA Interpretation 301-6 and minimum law-school Bar passage standards, the blawgosphere was largely (entirely?) silent.  Was that because the elite law schools, and even the top 100, don’t worry about the Bar?  Yes, the occasional Top 100 Dean gets toppled when Bar passage rates slip.  But the top law schools don’t measure themselves by the proportion of their graduates that can meet the minimum standards to become a lawyer.  That’s taken as a given.

Gary Rosin

Components of the 2010 US News Rankings of the Top 100 Law Schools

August 20th, 2009

The official (as opposed to the leaked) 2010 US News Law School rankings came out today.  Over at MoneyLaw, Tom Bell has an interesting post, How Top-Ranked Law Schools Got That Way, Pt. I. He looks at the weighted standardized scores on each of the 12 components of the overall score.  He then compares the amounts by which the component scores vary among the top 100, and the top 12, law schools.  As expected, the peer reputation scores (PeerRep) vary (and count) the most.  The surprising result is that the second highest variation is in overall expenditures per student (Over$/Stu):

[T]he Over$/Stu z-scores range quite widely, with Yale having more than double the score of all but two schools, Harvard and Stanford, which themselves manage less than two-thirds Yale’s Over$/Stu score. That wide spread gives the Over$/Stu score an especially powerful influence on Yale’s overall score, making it almost as important as Yale’s PeerRep score and much more important than any of the school’s remaining 10 z-scores. In effect, Yale’s extraordinary expenditures per student buy it a tenured slot at number one.

It would be interesting to see the relative component contributions for Tiers 3 and 4, as well as Tiers 2 and 3.

Gary Rosin

Vault Releases 2010 Top 100 Law-Firm Rankings

August 19th, 2009

Vault has released its Top 100 Law Firms rankings.  Paul Caron has posted the top 25 on TaxProf.  The Vault rankings are based on a survey of associates about law-firm prestige.

Gary Rosin