Archive for the ‘Rankings’ Category

Go To Law

Wednesday, March 2nd, 2011

The National Law Journal just released its annual Go-To Law Schools Report on hiring at firms in the NLJ 250 during 2010.  The report ranks the top 50 law schools based on their success in placing class of 2010 graduates at NLJ 250 firms.  Two things jumped out at me.  First, big-firm placement rates dropped off quickly, from 58.97% of the class of 2010 at the top school, the University of Chicago Law School, to 10.85% at the 50th-ranked school, Washington and Lee University Law School.  Second, two schools outside the top 50 in the 2010 US News rankings, but located in or near major metropolitan areas, landed in the top 50 in the Go-To rankings:  Rutgers School of Law–Newark (Tier 2) and Howard University School of Law (Tier 3), which ranked 49th and 31st, respectively.

posted by Gary Rosin

GPAs and Standardized Test Abuse

Friday, October 23rd, 2009

An article by Scott Jaschik, More Testing, Less Logic? (Inside Higher Ed), comments on an article by Anne VanderMey, GMAT:  The MBA Job Seeker’s Best Friend (Business Week).  VanderMey reports on a disturbing trend in the MBA job market:

For a select group of companies, mostly top consulting, finance, and banking firms, employers routinely look to MBA graduates’ GMAT scores as a reliable standard measurement of academic prowess…. Particularly when jobs are tight, and every element of each résumé takes on added weight, test scores can be the difference between an interview and the dustbin.

According to both VanderMey and Jaschik, some schools are advising students to retake the GMAT. 

VanderMey observes that while employers looking for people to do “heavy quantitative lifting” find the quantitative portion of the GMAT useful, the real problem is that GPAs are not always useful:

Many MBA programs have grading systems that vary widely or are solely pass-fail, making it difficult for recruiters to compare applicants from different schools, and others don’t provide grades at all. Even at schools where grades are released, grade inflation may render As and Bs poor markers for actual skill. The tests can be a boon by virtue of their standardization….

The problem with GPAs is that they are not objective measures of performance.  Rather, they just sort each cohort of admitted students.  The strength of B-school cohorts varies from school to school, and even from year to year.
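To make that concrete, here is a minimal sketch (all numbers and school rosters are hypothetical): if each school grades on its own curve, the same GPA corresponds to very different underlying test performance at different schools, which is exactly why recruiters fall back on the standardized score.

```python
# Toy illustration (hypothetical numbers): a GPA only ranks students
# within their own cohort, while a standardized score is comparable
# across cohorts of different strength.

# Pretend "ability" is measured on the GMAT's 200-800 scale.
school_a = [720, 700, 680, 660, 640]   # strong cohort
school_b = [620, 600, 580, 560, 540]   # weaker cohort

def gpa_by_rank(cohort):
    """Assign GPAs purely by rank within the cohort (a forced curve)."""
    curve = [4.0, 3.7, 3.3, 3.0, 2.7]
    ranked = sorted(cohort, reverse=True)
    return dict(zip(ranked, curve))

# The top student at each school earns the same 4.0 ...
print(gpa_by_rank(school_a)[720])   # 4.0
print(gpa_by_rank(school_b)[620])   # 4.0
# ... even though their test scores differ by 100 points.
```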

Jaschik suggests that the problem is more acute at lower-ranked B-schools:

At the very top ranked business schools, … “companies assume that everyone there is strong, and don’t care about their scores as much. McKinsey or Goldman Sachs is going to hire 20-30 grads from there every year.” But … at other business schools, “where Goldman may only hand out a few job offers, they’ll look more carefully at everything in a student’s profile (including the GMAT) to determine who the lucky few will be. That’s not a knock on those lower-ranked schools; I think it’s just the reality of the situation.” (quoting Scott Shrum, director of admission consulting research at Veritas Prep, a “high end GMAT test-prep company”).

Jaschik’s main focus is on the use of GMAT scores for purposes other than as a guide to first-year MBA grades.  He argues that testing companies, such as the GMAC (presumably, the Graduate Management Admission Council), should more actively resist the use of test scores for purposes other than admission.

U.S. News uses LSAT scores of entering classes as one of the factors in its rankings of law schools.  The LSAC and the ABA also report the LSAT profiles of entering classes in their annual Official Guide to ABA-Approved Law Schools.  Of course, they also provide a wealth of additional data about each law school.

Has anyone heard of law firms and other employers of lawyers using LSAT scores in evaluating job applicants?

posted by Gary Rosin

Part 5 of the Legal Education at the Crossroads conference

Thursday, September 24th, 2009

The Big News from the Conference on Assessment:  Steve Bahls, Chair of the Student Learning Outcomes Subcommittee of the American Bar Association Section of Legal Education and Admissions to the Bar’s Standards Review Committee, presented the draft of the new Standards on assessment.  From his presentation, it sounds as if some form of these Standards will be recommended by the ABA.

Where do the new Standards take us?  First, the ABA, fortunately in my view, is not taking an extreme position.  The proposed Standards would require that all schools do some assessment of certain required competencies, such as “legal analysis and reasoning, legal research, problem solving, written and oral communication in a legal context.”  Beyond that, each school is required to identify additional learning outcomes based upon its own mission.  So, the ABA appears to be seeking to preserve a good degree of law school autonomy.

The real sea change comes, however, from the requirement that each school must “employ a variety of valid and reliable measures systematically and sequentially throughout the course of the students’ studies.”  Thus, a school simply will not be able to use a single summative final examination in the future, at least not in all its courses.  This is no doubt a good thing, but it will involve a huge change in how we teach.

Jeff Rensberger

Part 4 of the Legal Education at the Crossroads conference

Thursday, September 24th, 2009

One key group missing from the Conference was Deans.  I would have loved to hear from some Deans on how they would implement broad-based assessment when they are at the same time trying to manage budgets, get their faculty to write more, and improve their school’s US News ranking.  As to the latter, does one gain anything at all in the US News rankings by having a state-of-the-art assessment regime?  There is a huge issue of aligning what should be the prime goal of law schools–legal education–with other institutional imperatives, some of which, like US News, are imposed from without.

Jeff Rensberger

Part 3 of the Legal Education at the Crossroads conference

Thursday, September 24th, 2009

So, the Big Question is how does one perform a meaningful assessment in a large doctrinal class of, say, 90 students?  One of the most cogent remarks of the Conference was David Thompson’s observation that for assessment to penetrate deeply into law school classrooms, it must be made “dumb easy.”  Methods that work in small-group settings do not easily scale to a larger group unless a huge investment is made in additional teaching resources.

Long before the ABA’s interest in assessment, I wondered, like many doctrinal professors, why exactly I get away with giving only a single final exam for a course, with no quizzes and no mid-terms.  The answer I came up with, which I think is sound, is this:  Law schools and students strike a deal.  Students forgo the more regularized feedback and assessment present in most educational settings in exchange for getting a full professor and no teaching assistants.  One obvious way to make assessment work in a large doctrinal class is to farm it out to TAs.  But that breaks the bargain traditionally struck.  So, other than through TAs, how do we assess in large classes?  If this is to occur, it is going to either change the historic bargain or involve the magic genies of technology.  And there are some cost- and time-effective means of assessing through technology, such as on-line quizzes and audience response software.  But nothing is free.  If there is a cheap way to assess, it is probably less effective as a means of assessment than a costly and time-consuming one.

Jeff Rensberger

Components of US News 2010 Rankings (III)

Monday, August 31st, 2009

Tom Bell has posted the third installment of How Top-Ranked Law Schools Got that Way.  In this post, he focuses on schools ranked 41-51, 94-100, and the bottom 8 (according to his model).  What I would have liked to see, instead, are the schools on either side of the major breaks:  20th, 50th, 100th, Tier 3/Tier 4.  If peer reputation is the biggest influence on rankings, what moved schools above these lines?

Gary Rosin

Journal Reputation and Moving Up

Friday, August 28th, 2009

In Signaling Value of Law Reviews, I noted an article by Al Brophy (North Carolina) cautioning that scholarship should be judged on its own merits.  Paul Caron notes an empirical study of “the theory of cumulative advantage in science (Matthew Effect),” that controls for article quality.  In The Impact Factor’s Matthew Effect:  A Natural Experiment in Bibliometrics, Vincent Larivière & Yves Gingras conclude

The intrinsic value of a paper is thus not the only reason a given paper gets cited or not; there is a specific Matthew effect attached to journals and this gives to papers published there an added value over and above their intrinsic quality.

So, it’s not just a matter of the quality of the paper, but also of its placement.  It follows that an author’s academic reputation is also enhanced by placement.  Thus, the urge to “trade up” in the placement of articles.  Presumably, a law school’s peer reputation follows (with a lag?) that of its faculty.  Earlier, I discussed Jeff Lipshaw’s (Suffolk) thoughts on the penchant of ambitious young professors to “move up the food chain to a law school with a higher ranking.”

Given the strong influence of peer reputation in the US News law-school rankings, should lower-tier law schools try to move up in the rankings by offering pay-for-placement bonuses to young professors who might just be moving through?  Or does a professor’s ability to move up–or that professor’s earlier association with a school–also enhance the school’s peer reputation?

Gary Rosin

Ambition and Rankings?

Friday, August 21st, 2009

Jeff Lipshaw has an interesting comment on the rankings game, Ambition and Rankings:  “We Have Met the Enemy and He Is Us”.  He begins by noting a WSJ article by Eric Felten, who is of the view that “the rankings are really about getting ahead.”  He then notes a discussion with a colleague about the Big Law School game–moving up to more prestigious law schools.  He continues:

Yes, I think the rankings do have something to do with our subjective views of getting ahead, and I do think there’s something about the legal profession that makes OUR rankings so powerful.  I used the phrase “progressing up the food chain” with my colleague, and in what industries or professions is the food chain as quantitative as the legal profession?  * * *

* * *

… there’s a lot of self-selection in the process of becoming a lawyer, and even more in becoming a big law firm lawyer or a law professor.  I suspect the first element of that self-selection is a particular orientation to progressing up the food chain….  There ain’t that much to distinguish us…. There are only dozens and not thousands of law schools.  * * * In other words, it’s easy to see a well-defined food chain in the relatively small, homogeneous, and closed legal community.  

Most of the blogging about law-school rankings focuses on the top law schools, and sometimes as far down as the top 100.  Perhaps that’s because they are the only schools individually ranked, but I don’t think so.  I’m not sure that professors at top law schools really care about what happens on the other side of the Great Divide in the legal academy (Tiers 3 and 4).  If nothing else, the concerns of the lower-ranked schools are not the concerns of the elite. 

For example, during the debate about ABA Interpretation 301-6 and minimum law-school Bar passage standards, the blawgosphere was largely (entirely?) silent.  Was that because the elite law schools, and even the top 100, don’t worry about the Bar?  Yes, the occasional Top 100 Dean gets toppled when Bar passage rates slip.  But the top law schools don’t measure themselves by the proportion of their graduates who can meet the minimum standards to become a lawyer.  That’s taken as a given.

Gary Rosin

Components of the 2010 US News Rankings of the Top 100 Law Schools

Thursday, August 20th, 2009

The official (as opposed to the leaked) 2010 US News Law School rankings came out today.  Over at MoneyLaw, Tom Bell has an interesting post, How Top-Ranked Law Schools Got That Way, Pt. I.  He looks at the weighted standardized scores on each of the 12 components of the overall score.  He then compares the amounts by which the component scores vary among the top 100, and the top 12, law schools.  As expected, the peer reputation scores (PeerRep) vary (and count) the most.  The surprising result is that the second-highest variation is in overall expenditures per student (Over$/Stu):

[T]he Over$/Stu z-scores range quite widely, with Yale having more than double the score of all but two schools, Harvard and Stanford, which themselves manage less than two-thirds Yale’s Over$/Stu score. That wide spread gives the Over$/Stu score an especially powerful influence on Yale’s overall score, making it almost as important as Yale’s PeerRep score and much more important than any of the school’s remaining 10 z-scores. In effect, Yale’s extraordinary expenditures per student buy it a tenured slot at number one.

It would be interesting to see the relative component contributions for Tiers 3 and 4, as well as Tiers 2 and 3.
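To see why a wide spread in one component matters so much, here is a rough sketch of the z-score mechanics, using hypothetical weights and data (not US News’s actual weights or inputs):  because each component is standardized across schools before weighting, an extreme outlier on one component can contribute as much to a school’s overall score as a nominally much heavier component.

```python
# Rough sketch (hypothetical weights and data) of why a component with a
# wide spread can dominate one school's weighted overall score.
from statistics import mean, pstdev

def z_scores(values):
    """Standardize a component across all schools: (x - mean) / std dev."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

# Hypothetical data for four schools on two components.
peer_rep      = [4.8, 4.8, 4.6, 3.5]                # peer reputation (1-5)
spend_per_stu = [160_000, 75_000, 70_000, 40_000]   # overall $/student

# Illustrative weights only; US News's actual weights differ.
w_peer, w_spend = 0.25, 0.10

z_peer, z_spend = z_scores(peer_rep), z_scores(spend_per_stu)

for name, zp, zs in zip(["Outlier spender", "B", "C", "D"], z_peer, z_spend):
    print(f"{name:>15}: peer {w_peer * zp:+.3f}, spending {w_spend * zs:+.3f}")

# For the outlier school, the spending contribution (about +0.17) rivals
# its peer-reputation contribution (also about +0.17), even though the
# spending weight is less than half the reputation weight.
```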

Gary Rosin

Vault Releases 2010 Top 100 Law-Firm Rankings

Wednesday, August 19th, 2009

Vault has released its Top 100 Law Firms rankings.  Paul Caron has posted the top 25 on TaxProf.  The Vault rankings are based on a survey of associates about law-firm prestige.

Gary Rosin