How did more than 30 law schools survive class actions concerning allegations of misrepresentations in employment statistics?

The third in an occasional series I call “dire predictions.”

March 2012 was a turning point for law schools. Fourteen law schools were facing consumer protection class actions over allegations that they misrepresented their employment statistics, deceiving prospective and current law students about the value proposition of a legal degree.

From a feature in February 2012 in New York Magazine’s “Intelligencer”:

“We believe that some in the legal academy have done a disservice to the profession and the nation by saddling tens of thousands of young lawyers with massive debt for a degree worth far less than advertised,” David Anziska wrote in a statement today. “[I]t is time for the schools to take responsibility, provide compensation and commit to transparency. These lawsuits are only the beginning.”

So do Anziska, Strauss, and Raimond actually have a shot at making these lawsuits stick? Well, yes, says Paul Campos, a professor at the University of Colorado Law School — particularly if the law schools are compelled to turn over their internal job placement data, which could prove so embarrassing that the law schools would decide to settle with the plaintiffs.

From a lengthy feature in March 2012 by the same author in New York Magazine:

It’s not yet clear whether the lawyers have proof that NYLS and the other defendants are cooking their numbers. What they do have is at least one favorable precedent: Last year, San Francisco’s California Culinary Academy was sued for misleading applicants about their chances of landing gainful employment in the gastronomic arts, leading to a settlement under which the school reportedly issued tuition refunds to as many as 8,000 students. Anziska, Raimond, and Strauss hope to use the discovery process to compel their targets to turn over all their internal data on their graduates’ livelihoods and to see how that data squares with the claims posted on websites and in recruiting literature—or, barring that, to show that the schools aren’t really trying to keep complete, accurate figures to begin with.

“In that case, the schools will have to disclose a lot of potentially embarrassing information,” predicts University of Colorado law professor Paul Campos, a prominent skeptic of law schools’ self-reported placement numbers. That is, if the schools don’t cut deals to make their cases go away. As you learn in Intro to Civil Procedure, lawyers can win without going all the way to trial.

As of March 15, 2012, the plaintiffs announced they’d file claims against “20 more law schools,” in addition to the first 14.

But that never happened. Those 20 schools were never sued.

And by March 21, 2012, a decidedly different result came from a New York court:

A state judge on Wednesday threw out a class-action lawsuit against New York Law School, one of the first of 15 schools hauled into court for allegedly inflating their job-placement and salary statistics to attract applicants.

The sweeping ruling, which could have an effect on the 14 other lawsuits filed since last summer and 20 others that have since been threatened, was issued by Judge Melvin L. Schweitzer, of the Supreme Court of the State of New York (a trial-level court in spite of its name).

In a 36-page ruling, the judge found that the plaintiffs had failed to prove that the law school had misled them "in a material way." Judge Schweitzer also said applicants to New York Law School had plenty of information available to them about their realistic chances of getting a job.

Courts continued to reject these claims. None of these lawsuits “stuck.”

Winning the press release (i.e., the filing of a complaint) is quite different from getting to the merits—much less surviving the motion to dismiss stage. And it turns out that legal education is quite different as a value proposition than, say, culinary school.

That’s not to say all schools made it out of the recession unscathed. Many closed. But losses in the class action domain never materialized.

Projecting the 2025-2026 USNWR law school rankings (to be released March 2025 or so)

Fifty-eight percent of the new USNWR law school rankings turn on three highly volatile categories: employment 10 months after graduation, first-time bar passage, and ultimate bar passage. USNWR has tried to smooth these out by using a two-year average of these scores. (Next year, it might well use a three-year average or a three-year weighted average.)

Because USNWR releases its rankings in the spring, at the same time the ABA releases new data in these categories, the USNWR law school rankings are always a year behind. This year’s data include the ultimate bar passage rate for the Classes of 2019 and 2020, the first-time bar passage rate for the Classes of 2021 and 2022, and the employment outcomes of the Classes of 2021 and 2022.

We can quickly update all of that data with this year’s data (as I made an effort to do, with some modest success, early last year). And given that the other 42% of the rankings is much less volatile, we can simply carry this year’s data over to next year and have, within a couple of ranking slots or so, a very good idea of where law schools will land. (Of course, USNWR is free to tweak its methodology once again next year. Some volatility makes sense, because it reflects responsiveness to new data and changed conditions; too much volatility tends to undermine the credibility of the rankings, as it points toward arbitrary criteria and weights that do not meaningfully reflect changes at schools year over year.) Some schools, of course, will see significant changes to LSAT medians, UGPA medians, student-faculty ratios, and so on relative to peers. Some schools have significantly increased school-funded positions after the change in USNWR methodology. And the peer scores may be slightly more volatile than in past years. Likewise, lawyer and judge scoring of law schools appears to be adversely affecting the most elite law schools most significantly, and that trend may continue.

But, again, this is a first, rough cut of what the new (and volatile) methodology may yield. High volatility and compression mean bigger swings in any given year. They also mean that smaller classes are more susceptible to larger swings (e.g., a couple of graduates whose bar or employment outcomes change are more likely to move a small school’s position than a large school’s).

If you are inclined to ask, “How could school X move up/down so much?” the answer is, bar and employment, bar and employment, bar and employment.

Here are the early projections. (Where there are ties, schools are sorted by score, which is not reported here.)

UPDATE: I continue to have difficulty assessing Wisconsin’s two law schools due to diploma privilege and how USNWR purports to measure bar passage statistics, so their rankings may be lower than would be expected.

School Projected Rank This Year's Rank
Stanford 1 1
Chicago 2 3
Yale 3 1
Virginia 3 4
Penn 5 4
Harvard 5 4
Michigan 7 9
Duke 7 4
Northwestern 9 9
Columbia 9 8
NYU 9 9
UCLA 12 13
Berkeley 13 12
Vanderbilt 14 19
Washington Univ. 14 16
Georgetown 14 14
Texas 14 16
North Carolina 18 20
Cornell 18 14
Notre Dame 20 20
Minnesota 21 16
Boston Univ. 22 24
Wake Forest 22 25
Georgia 24 20
USC 24 20
Texas A&M 24 26
Boston College 27 28
Florida 28 28
William & Mary 29 36
Alabama 29 33
Ohio State 29 26
George Mason 29 28
BYU 33 28
Washington & Lee 33 33
Utah 33 28
Irvine 33 42
Florida State 37 48
Iowa 37 36
George Washington 37 41
Emory 40 42
Baylor 40 46
Fordham 40 33
SMU 43 42
Arizona State 43 36
Wisconsin 45 36
Illinois 45 36
Colorado 45 48
Indiana-Bloomington 48 42
Villanova 48 48
Davis 48 55
Connecticut 48 55
Pepperdine 52 52
Kansas 52 46
Washington 52 48
Temple 52 54
Tennessee 56 52
San Diego 56 68
Missouri 58 61
Penn State Law 58 68
Arizona 58 55
Penn State-Dickinson 58 75
Oklahoma 58 55
Maryland 63 55
Wayne State 63 55
Kentucky 65 61
Loyola-Los Angeles 65 61
Pitt 65 91
Houston 65 68
Cardozo 65 61
South Carolina 65 66
UNLV 71 78
Cincinnati 71 78
St. John's 71 68
Tulane 71 78
Seton Hall 71 61
Nebraska 71 82
Catholic 71 94
Northeastern 71 68
Florida International 71 68
Richmond 80 66
LSU 80 91
Drexel 80 75
Georgia State 80 75
Maine 84 120
Loyola-Chicago 84 78
Belmont 86 91
Marquette 86 68
Texas Tech 88 82
Miami 88 82
Denver 88 89
UC Law-SF 88 82
Drake 92 82
Duquesne 92 94
Stetson 92 98
Lewis & Clark 95 82
Oregon 95 82
St. Louis 95 94
Chapman 98 108
American 98 98
Buffalo 98 108
Dayton 98 108
Rutgers 98 103

(Any mistakes are my own. One data collection note: I often transpose some schools due to inconsistencies in how the ABA reports school names. Schools whose names begin with Chicago, Saint, South, or Widener are most susceptible to these inconsistencies.)

Will an earlier big law firm recruiting calendar change the market for prospective law school transfer students?

Back in 2018, the National Association for Law Placement loosened some of its calendar and deadlines for on-campus recruiting for law schools and law students. The concern was largely antitrust: coordinated behavior among large law firms that could affect the labor market. Law firms continued with the inertia of earlier practices, but they also began to move recruiting earlier: on-campus interviews (OCI) still happened in the fall of 2L year, but some firms were moving to August or flirting with summer dates. The Covid-19 pandemic in 2020 helped accelerate the move: rather than investing in laborious and time-intensive OCI, screening interviews could happen quickly over Zoom and could be pushed earlier into the summer.

This inevitable unraveling continues. Bloomberg reports that some OCI is moving to the spring of 1L year. The bulk of OCI will be complete at many firms by July 1; many will still have some spots available for later placement, but surely a minority of spots.

This is, on the whole, bad for law students, as one semester of grades, minimal writing samples, no journal membership, and the like make for thin resumes. Employers will likely rely increasingly on proxies like undergraduate institution and undergraduate grades. Relatedly, it can create additional pressure for first-generation law students, who may not be as attuned to how early the law firm hiring process takes place and might miss opportunities that students with attorneys in the family would know about. It puts pressure on schools to provide additional education and career development awareness (perhaps with more such staff) for students.

But an interesting Reddit thread raised a different concern: how does this change in the market affect transfer students?

One of the big perks of students transferring “up,” if you will, is taking advantage of the new school’s OCI. If your new school has more robust OCI opportunities for the fall of 2L year, it redounds to your benefit to transfer and take advantage of them immediately. Students often give up significant scholarships at lower-ranked schools to take on significant debt at higher-ranked schools. Part of that tradeoff is the benefits of OCI.

But what if OCI moves to the spring of 1L year or the early summer after it—well before transfer applications are accepted and completed? The benefits of OCI would seem to be lost—as would a major reason to take on additional debt, switch schools (sometimes moving to a new state), and transfer.

To be sure, there are other benefits of a school—the alumni network, the reputation benefits, and so on. There are many reasons a transfer might be deemed beneficial. But if one of the major reasons for transferring disappears, I wonder if we might see a change in student behavior. And for schools that have previously relied heavily on transfer students for budgetary purposes, or to make 1L admissions classes look a certain way for LSAT and UGPA medians, it could be quite disruptive.

This might be why more schools are moving to early transfer applications, too. If schools realize that the benefits for transferring students are moving earlier, they need to incentivize students to apply earlier, accept them earlier, and give them the potential benefits earlier.

It might also explain the overall decline in transfer applications. But there are many market forces at work (a good economy for a few years reduces the urgency to transfer, I would assume), and it’s possible this changes as we see softening recruitment.

It’s one interesting relationship between two things I hadn’t thought much about (the moving OCI window and the transfer market), and one I’ll be watching in the years ahead.

One-point increase in student loan rates could cost new law school graduates tens of thousands of dollars in added debt

CNBC reports the new student loan interest rate figures, and they are pretty dire for higher education in general and law schools in particular:

For graduate students, loans will probably come with an 8% interest rate, compared with 7% now, he said.

Plus loans for graduate students and parents may have a 9% interest rate, an increase from 8%.

From a simple student loan calculator, we can make some estimates of debt and repayment. $150,000 in student loans ($50,000 per year) at 7% interest results in around $90,000 in total interest. That jumps to $105,000 in total interest if the rate is 8%. That’s an extra $15,000 in interest (and debt), hidden from students at the outset of the loan, and it does not redound to the benefit of the law school.

On $75,000 per year in student loans ($225,000 in total), interest jumps from $135,000 to $158,000, an increase of $23,000.

Even more modest debt, like $25,000 per year ($75,000 in total), sees interest jump from $45,000 to $53,000, although an $8,000 increase is much more manageable.
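These estimates can be reproduced with the standard amortization formula for a fixed-rate loan. A minimal sketch follows; the post does not state the repayment term, so a 15-year term is assumed here because it roughly reproduces the figures above:

```python
def total_interest(principal, annual_rate, years):
    """Total interest paid over the life of a fully amortized loan
    with fixed monthly payments."""
    r = annual_rate / 12                 # monthly interest rate
    n = years * 12                       # number of monthly payments
    payment = principal * r / (1 - (1 + r) ** -n)
    return payment * n - principal

# $150,000 at 7% vs. 8%, assuming a 15-year repayment term
extra = total_interest(150_000, 0.08, 15) - total_interest(150_000, 0.07, 15)
print(round(extra))  # roughly the extra $15,000 described above
```

Note that the one-point rate increase, not any change in principal, drives the entire difference; a longer repayment term would amplify it further.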

(These figures of course are exacerbated by another hidden cost, the ending of subsidized graduate student loans in 2011, which allows interest to accrue during law school.)

Student debt loads reported by the Department of Education can factor in these interest figures, and they can be helpful in assessing which programs see some of the highest debt loads upon graduation (and shortly after graduation), along with salary data.

All of this makes a robust economy (big law firm salaries are $215,000 to start, and big law hiring remains high for now, though on the downslope), and a school’s robust loan repayment assistance program (LRAP), all the more important for legal education. And as a hidden cost, it requires some foresight from law schools to anticipate and prepare for the challenges ahead.

Which law schools saw the biggest changes in employment placement after USNWR gave "full weight" to new categories of jobs?

Back in 2016, I noted how a lot of law school-funded positions “dried up” once USNWR stopped giving those jobs “full weight” in its law school rankings. Yes, correlation does not equal causation. And yes, there were other contributing causes (e.g., changes in how the ABA required reporting of such positions). But the trend was stark and the timing noteworthy.

The trend is likewise stark, at least in one category.

USNWR is now giving weight to full-time, long-term (1 year or longer) bar passage-required and J.D. advantage jobs funded by law schools. It is also giving weight to those enrolled full time in a graduate degree program.

In two categories, graduate degree and bar passage-required, there were not significant variances from previous years. For bar passage-required jobs, that is perhaps understandable. Such positions have hovered between 200 and 300 for several years (239 last year, 212 the year before), and they are really driven by a handful of schools that can sustain a kind of “bridge” program for students interested in public interest work.

For graduate degree placement, it actually hit an all-time low since reporting began—just 344, down from 375 last year and from the record 1,231 for the Class of 2010. I had thought this might be a tempting category for schools to press students into to give them “full weight” positions for USNWR purposes. Not so.

But the one category that did stand out was J.D. advantage jobs funded by the school. Here, again, we are in an incredibly small category of jobs—just 97 for the Class of 2023, only one quarter of one percent of all jobs. (And again, it’s worth noting, even though these three categories combine for fewer than 700 graduates among 35,000, this was one of the leading charges of the pro-“boycott” law schools.) But there is a marked uptick, returning to a pre-2015 high.

We also know that not all these jobs are randomly distributed. They can be concentrated at some schools. We can also try to identify if some schools saw a marked rise in these three categories of jobs last year. But of course, there can be volatility from year to year in any particular category.

I looked at law schools’ average 2020, 2021, and 2022 output into these three categories of previously lesser-weight employment outcomes. I then compared placement for the Class of 2023 to that three-year average in these combined categories. The top 15 schools are listed below.

Employment placement in full time, long term, bar passage required or JD advantage jobs funded by the school or in graduate degree programs
SchoolName 2020-2022 avg 2023 Delta
PEPPERDINE UNIVERSITY 2.2% 8.2% 6.0
WASHINGTON UNIVERSITY 0.6% 4.8% 4.2
CATHOLIC UNIVERSITY OF AMERICA 0.0% 3.7% 3.7
ARKANSAS, LITTLE ROCK, UNIVERSITY OF 2.2% 5.7% 3.5
YALE UNIVERSITY 11.7% 15.2% 3.4
WASHBURN UNIVERSITY 0.0% 3.3% 3.3
GEORGE MASON UNIVERSITY 1.2% 4.3% 3.1
SOUTH DAKOTA, UNIVERSITY OF 0.4% 3.4% 3.0
CORNELL UNIVERSITY 1.9% 4.6% 2.7
DUQUESNE UNIVERSITY 0.3% 2.9% 2.7
ARKANSAS, FAYETTEVILLE, UNIVERSITY OF 3.3% 5.9% 2.6
MISSISSIPPI, UNIVERSITY OF 2.5% 5.0% 2.4
LIBERTY UNIVERSITY 1.0% 3.3% 2.3
UNIVERSITY OF BUFFALO-SUNY 0.3% 2.5% 2.3
WISCONSIN, UNIVERSITY OF 1.2% 3.5% 2.3
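The comparison in the table above can be sketched in a few lines. A hypothetical illustration (the rates below are invented for demonstration, not ABA data):

```python
def delta(prior_year_rates, current_rate):
    """Percentage-point change from the average of prior years' placement
    rates (expressed as fractions) to the current year's rate."""
    avg = sum(prior_year_rates) / len(prior_year_rates)
    return round((current_rate - avg) * 100, 1)

# A school placing 2.2%, 2.1%, and 2.3% of its class in these categories
# in 2020-2022, then 8.2% in 2023, shows a 6.0-point delta:
print(delta([0.022, 0.021, 0.023], 0.082))  # 6.0
```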

As I wrote back in 2016, correlation is not causation, and there are of course confounding variables and factors in place at any given institution. But there’s no question the change in “full weight” categories by USNWR comes at a time when some schools are undergoing material changes to their typical employment outcomes in categories that previously did not receive “full weight” but now do. And while many of these figures appear to be small changes, we know that very small changes in the new methodology can yield big differences: “By shifting about 3 percentage points of a class from “unemployed” to a “full weight” job (in a school of 200, that’s 6 students), a school can move from being ranked about 100 in that category to 50.” (Note: this effect is somewhat diluted as it is a two-year employment average, but if the same thing happens year over year, the effects will remain the same.)

As the law firm hiring market slows down, I’ll be watching the overall trends and the individual trends for the Class of 2024 in particular.

Overall legal employment for the Class of 2023 improves slightly, with large law firm and public interest placement growing

That is literally the same headline I had for the Class of 2022, but it’s another year of incremental improvement. Below are figures from the ABA-disclosed data (excluding Puerto Rico’s three law schools). These are ten-month figures, as of March 15, 2024, for the Class of 2023.

  Graduates FTLT BPR Placement FTLT JDA
Class of 2012 45,751 25,503 55.7% 4,218
Class of 2013 46,112 25,787 55.9% 4,550
Class of 2014 43,195 25,348 58.7% 4,774
Class of 2015 40,205 23,895 59.4% 4,416
Class of 2016 36,654 22,874 62.4% 3,948
Class of 2017 34,428 23,078 67.0% 3,121
Class of 2018 33,633 23,314 69.3% 3,123
Class of 2019 33,462 24,409 72.9% 2,799
Class of 2020 33,926 24,006 70.8% 2,514
Class of 2021 35,310 26,423 74.8% 3,056
Class of 2022 35,638 27,607 77.5% 2,734
Class of 2023 34,848 27,828 79.9% 2,167

Placement continues to be very good. There was an increase of a couple hundred full-time, long-term bar passage-required jobs year-over-year, and the graduating class size dipped a bit. Those factors combined for a placement rate of 79.9%. J.D. advantage jobs decreased somewhat, perhaps consistent with a hot law firm market last year.

It’s remarkable to compare the placement rates from the Class of 2012 to the present, from 56% to 80%. And it’s largely attributable to the decline in class size.

Here’s some comparison of the year-over-year categories.

FTLT Class of 2022 Class of 2023 Net Delta
Solo 160 174 14 8.7%
2-10 5,070 4,751 -319 -6.3%
11-25 2,115 2,047 -68 -3.2%
26-50 1,360 1,340 -20 -1.5%
51-100 1,175 1,157 -18 -1.5%
101-250 1,246 1,234 -12 -1.0%
251-500 1,145 1,223 78 6.8%
501+ 6,137 6,360 223 3.6%
Business/Industry 2,797 2,236 -561 -20.1%
Government 3,591 3,766 175 4.9%
Public Interest 2,875 2,991 116 4.0%
Federal Clerk 1,130 1,182 52 4.6%
State Clerk 2,053 2,067 14 0.7%
Academia/Education 375 367 -8 -2.1%

The trend continues a longstanding uptick in public interest placement; this year is not an outlier. Public interest job placement is up over 100% since the Class of 2017, and these eye-popping numbers continue to rise. It is likely not an overstatement to say that law students are increasingly oriented toward public interest, and that there are ample funding opportunities in public interest work to sustain these graduates. (I include a visualization of the trend of raw placement into these jobs here.)

Sole practitioners continue to slide significantly (they were in the low 300s not long ago in raw placement).

Additionally, large law firm placement continues to boom. Placement is up by thousands of graduates over the last several years. Placement in firms with at least 101 attorneys is around 8,800. A full 25% of all law school graduates landed in a “Big Law” firm, and more than 30% of those employed in a full-time, long-term, bar passage-required job did so. (Charts showing both the raw placement and the percentage of graduates are included here in another visualization, charting both on two different axes to show similar trends.)
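Those shares can be recomputed directly from the two tables above:

```python
# "Big Law" here means the three largest firm-size rows (101+ attorneys)
# in the Class of 2023 column of the firm-size table above.
big_law = 1_234 + 1_223 + 6_360   # = 8,817 graduates
graduates = 34_848                # Class of 2023 graduates
ftlt_bpr = 27_828                 # FT/LT bar passage-required placements

share_of_all = big_law / graduates        # about 25% of all graduates
share_of_employed = big_law / ftlt_bpr    # over 30% of FT/LT BPR placements
```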

How did law schools and law firms survive the purported “death of Big Law”? Well, Big Law seems to be doing better than ever. It’s not clear we’ll have as hot a market for this year’s class, but it’s something to watch.

One slightly interesting observation is a sharp decline in “business” placement. These tend to be JD advantage positions, and if there’s a decline in JD advantage placement we’d expect to see a decline here, too. But the drop seems more significant than the decline in JD advantage jobs alone would explain.

Federal clerkship placement improved a bit but has remained mostly steady.

There’s a lot more to examine in light of USNWR methodology changes, but I’ll save that for another post.

The 2024-2025 USNWR law school rankings: methodology tweaks may help entrench elite schools, but elite schools see reputation decline among lawyers and judges

Hours after the release of last year’s dramatic change to the USNWR methodology, I noted the dramatic increase in “compression and volatility” in the coming rankings.

USNWR changed a couple of things in its methodology:

There were a couple differences in how the rankings were calculated, described below. In summary, U.S. News averaged its bar passage and employment indicators over two years. Also, the lawyers and judges assessment score had a second source of ratings besides names supplied by law schools.

While it might not be the design—more on that in a moment—its effect may well be to entrench elite schools.

1. Changes to employment (and bar passage)

USNWR decided to use two-year figures for both employment and bar passage. Here’s how it explained the employment changes.

To improve measurement of this indicator – given the common year-to-year fluctuations associated with outcome measures and the small sizes of some graduating J.D. classes – this indicator was derived from the average of the 2021 and 2022 graduating class outcomes 10 months after graduation.

This isn’t entirely true, for a few reasons. First, the “small sizes” of some classes are not the real issue—and they have been a fact for the decades that USNWR has used these categories, yet it never thought to include a two-year average until now. Likewise, there have always been year-to-year fluctuations across those decades.

The issue, instead, is a problem about compression in their rankings system with the new methodology and high volatility in the categories given the most weight.

Compare this visualization of schools two years ago to last year, and where the raw scores put the top ~60 schools.

The methodology changes removed or reduced the weight of categories that created a broader spread across schools. That created the compression. Then it gave additional weight to the categories that are the most volatile. That would lead to this year’s projected more dramatic changes among schools—not just volatility, but volatility within a highly compressed rating system.

So why did USNWR decide to change this year? There are two possible explanations for this change, and, tellingly, either explanation looks bad for USNWR.

One explanation is that USNWR was simply unaware of the potential volatility in its rankings and is responding now. That is a bad look for USNWR. It took me minutes to spot this likely problem. If it escaped their entire data team’s months-long vetting, that’s a telling concession.

The other explanation is that USNWR was aware of the potential volatility and took a step this year to reduce it. That’s a bad look, too—if it was aware of the problem, why didn’t it address the problem then? It did, after all, have all of the granular employment data in previous years. And if it was aware last year, what prompted the change this year?

The related answer to both, by the way, is that it saw something problematic in what the outputs would be and modified the weighting to avoid undesirable results. This is not something I have proof of, I admit. I can only infer it from the actions taken in response to some events of the last year.

But we saw a few schools—notably, as I pointed out, NYU and Cornell—that would disproportionately suffer under the new system. I projected NYU to slide to 11 and Cornell to 18. Instead, with the re-weighting, NYU slid only to 9, and Cornell to 14. Other schools—particularly Washington University in St. Louis, North Carolina, and Texas A&M—were projected to rise much faster. The rankings changes are designed to put a governor on moves down—or up—the rankings.

Now, it’s not possible to prove that USNWR saw that NYU and Cornell would slide much faster than they thought appropriate and changed the methodology. But I can simply point out that these arguments were raised publicly for months, and this methodological change is designed to slow down the kinds of dramatic changes that we publicly expected this year. It’s not a good look whatever the motivation was, because it reflects a lack of competence about the changes instituted last year. Relatedly, USNWR is here conceding that too much volatility is a bad thing. That is, it would prefer to see less movement (and more entrenchment) in its final product.

(The lengthening window of data is creating increasingly strange results. For instance, today’s prospective law students are considering what their employment and outcome prospects look like in 2027. The current methodology has data stretching back to the Class of 2019 (two-year average of ultimate bar passage rates for the Classes of 2019 and 2020). That said, perhaps it’s better to think of schools over a longer period of time rather than just one-year data sets each year.)

2. Added value of career development (and bar support) at law schools

Last December, I blogged, “Perhaps the most valuable legal education job in the new USNWR rankings landscape? Career development.” If that was true then, it’s essentially doubly true now.

When I looked at the dramatic opportunity for law schools to rethink how they do admissions, I highlighted how broad the spread was for employment outcomes, and how small fluctuations could affect a school’s place dramatically. (See earlier for NYU and Cornell.) The same was true, to a lesser extent, for bar passage.

Now, in pure mathematical terms, the effect of a given class’s employment output is unchanged. It was 33% of the rankings last year; it’s now (effectively) 16.5% of the rankings for each of two years. Formally, no difference.

But, I would posit, I think employment effects have now effectively doubled.

A good year will redound to a school for two years; a bad year will need to be managed across two years. No more opportunities to rip the bandage off and move on to the next year; a bad year will linger. And yes, while any one class receives less weight in a given year, a school is seeking to maximize the effect every year.
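The persistence effect can be illustrated with a toy calculation. The weight and placement rates below are approximations and hypotheticals, not USNWR’s actual formula:

```python
EMPLOYMENT_WEIGHT = 0.33  # approximate weight of the employment indicator

def employment_component(prior_class_rate, current_class_rate):
    """Employment contribution to one rankings cycle under a two-year average."""
    return EMPLOYMENT_WEIGHT * (prior_class_rate + current_class_rate) / 2

# A hypothetical Class of 2023 placement rate of 0.80 counts this year
# (averaged with 2022) and again next year (averaged with 2024):
this_year = employment_component(0.70, 0.80)   # Classes of 2022 + 2023
next_year = employment_component(0.80, 0.85)   # Classes of 2023 + 2024
# One strong (or weak) class thus influences two consecutive rankings.
```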

So what I said before, about career development being the most valuable job in legal education? Doubly true.

The legal profession is witnessing a slowdown in hiring. Tougher times are coming to graduating law classes in the very near future. And you don’t want to be preparing for the storm in the middle of it. Law schools should be in the process of adding to their career development offices—in fact, I’d say, as a rule of thumb, doubling the size. And if you’re not… well, I hate to use the term “academic malpractice” without an individualized assessment, but it’s the term I’m likely to use anyway. And while that may sound like overkill, recall that this isn’t simply a USNWR gimmick. It benefits students to have high quality career advising and mentoring for their future professional careers, particularly as economic challenges arise in the near future.

(The same is true for bar passage, but at many schools, I think, the value will largely be in ensuring that students ultimately get over the finish line if they fail the bar exam on the first attempt. The state-specific relative metric of the bar exam makes it tougher to quantify here. So the same is true, I think, just to a smaller degree, of bar support more generally.)

3. Changes to lawyer and judge peer reputation surveys

One more methodological change of note:

Legal professionals – including hiring partners of law firms, practicing attorneys and judges – rated programs' overall quality on a scale from 1 (marginal) to 5 (outstanding), and were instructed to mark "don't know" for schools they did not know well enough to evaluate. A school's score is the average of 1-5 ratings it received across the three most recent survey years. U.S. News administered the legal professionals survey in fall 2023 and early 2024 to recipients that law schools provided to U.S. News in summer 2023. Of those recipients surveyed in fall 2022 and early 2023, 43% responded. For this edition, U.S. News complemented these ratings by surveying partners at big law firms, sampled based on their size – larger firms were more frequently surveyed – while establishing geographic dispersion. Leopard Solutions, which partnered with U.S. News on its Best Companies to Work For: Law Firms list, provided U.S. News with the contacts from which a sample was drawn.

USNWR recognized that as schools “boycotted” the survey, it would have a smaller universe of lawyers and judges to survey. In the past, schools each submitted 10 names (up to ~2,000 names in total). The response rate was quite low, so USNWR used a three-year average. As schools stopped submitting names, USNWR looked elsewhere.

And it deliberately selected a category: “partners at big law firms, sampled based on their size—larger firms were more frequently surveyed.”

In “Where Do Partners Come From?,” Professor Ted Seto tracked where NLJ 100 law firm partners—partners at the largest law firms—came from. The data is from 2012, but we know that partnership in large law firms is not susceptible to significant fluctuations. Here are the top 20, with the raw number of partners listed:

1 Harvard 946

2 Georgetown 729

3 NYU 543

4 Virginia 527

5 Columbia 516

6 George Washington 447

7 Michigan 444

8 Chicago 426

9 Texas 384

10 Northwestern 365

11 Pennsylvania 329

12 Boston University 317

13 Fordham 306

14 UC Berkeley 287

15 UCLA 257

16 Yale 253

17 Stanford 240

18 UC Hastings 233

19 Duke 219

20 Boston College 213

These 20 schools are nearly all in the “top 20” of the USNWR rankings or just outside it, and the handful that fall outside (e.g., George Washington; UC Law SF, formerly Hastings) have, at varying times, been closer to the “top 20.” We can expect some affinity (or bias) among these partners for their home institutions, and perhaps for “peer” institutions as well (e.g., the schools their fellow partners attended).

With almost clinical precision, then, USNWR has opted for a category to “complement” the survey that is likely to benefit the most elite law schools.

So, did it work? Well, to be fair, perhaps my assumption is wrong.

It’s worth noting that 11 of the “top 14” schools are at all-time lows in the lawyer and judge survey category—either new lows or ties of previous lows—since USNWR began this metric in 1998. Here is each school’s score in this category (on a 1-5 scale), alongside its all-time high:

Stanford: 4.7 (all-time high: 4.9)

Harvard: 4.6 (4.9)

Chicago: 4.6 (4.8)

Columbia: 4.5 (4.8)

Yale: 4.5 (4.9)

Michigan: 4.4 (4.7)

Virginia: 4.4 (4.6)

Duke: 4.3 (4.5)

NYU: 4.3 (4.6)

Berkeley: 4.3 (4.6)

Georgetown: 4.2 (4.5)

Penn declined from 4.4 to 4.3, and Cornell declined from 4.3 to 4.2, but neither was an all-time low. Only Northwestern held steady (at 4.3) without experiencing an all-time low. UPDATE: I mistakenly listed Virginia’s previous high as 4.5 instead of 4.6.

Compare that to the next 86 schools that have been ranked in this category since 1998: just 6 others experienced all-time lows, again either new lows or ties of previous lows.

What could cause this disparity? Causation is tough to identify here, but let me posit two things.

First, we are seeing the slow phase-out of “boycotting” schools’ data inputs. With a three-year average, two of the three years of survey names now postdate the boycott, so two-thirds of those schools’ inputs are out of the mix. Schools were inclined to include their own supporters, and those supporters are gone. Now, it’s hard to say that this is happening with such clinical precision at only the elite law schools and nowhere else. But perhaps a lot of elite law schools boycotted, and there are some tag-along effects, as elite schools tend to rate elite schools comparably. That said, “complementing” the survey with big law partner data should help shore up these figures, but perhaps there’s not enough of it (indeed, we have no idea how those responses are mixed in with the rest of the data).

Second, it is possible that lawyers and judges—and perhaps in particular big law firm partners—are viewing elite law schools with less respect than at any time in recent history, and perhaps more so in the last year than at any other time. It might be law student or university protests over the Gaza conflict, fossil fuels, free speech—pick a cause. And perhaps the brunt of that publicity (and perhaps of the underlying events) is falling on the most elite schools, creating fallout for their reputations in the legal community more generally. But that is very hard to verify or pinpoint, and one might want to see what happens next year.

Neither is a perfect causal explanation, but both offer some possibilities to consider. Now, again, I would have expected the new methodology to help entrench elite schools, but this year it seems not to have done so.

We shall see what happens next year. Will three-year averages of some categories be in store? Or will USNWR introduce other categories (e.g., if big law firm partners merit special surveys, shouldn’t such outcomes for employment merit special weight?) consistent with concerns that the methodology ought to value certain things more than it has in the past?

There’s much more to discuss, of course, but this is my first take on the methodological changes in particular and on the noteworthy shift in reputation scores from the legal profession for a cohort of law schools.

What percentage of law school faculty have recently contributed to political candidates?

My recent post, “Law school faculty monetary contributions to political candidates, 2017 to early 2023,” has garnered a lot of attention and feedback, and I’m grateful for people’s interest in it! Some recurring questions came up.

First, what does this say about the percentage of politically engaged faculty (an important question raised by Professor Milan Markovic and others)?

I tracked around 3,300 faculty. That count could double-count faculty who moved between schools, so the true number could be smaller. And some contributors could self-identify as a “law professor” without teaching at a law school (e.g., a business law professor in an undergraduate program)—another way it could be smaller.

But I suspect I am somewhat understating the results, as I know of a non-trivial number of faculty (some of whom followed up with me after that post!) who are not included because they failed to list their occupation; listed some other occupation, like attorney or teacher; or listed a title such as “professor of legal writing,” “professor of the practice,” or “Williams Chair in Constitutional Law.”

That’s all a hedge, and let’s set it aside for a moment. Based on what we know, what does it say about political engagement?

The contribution window spanned more than five years, 2017 to early 2023. I measured against law school faculties as of 2022—around 9,195 “full-time” faculty—which works out to around 35.5% of faculty contributing during this window.
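
As a rough arithmetic check, the headline share can be reproduced from the rounded counts above (the precise 35.5% figure presumably reflects the exact underlying counts and the double-counting caveats):

```python
# Rough check of the contribution rate using the rounded figures from this
# post: ~3,300 contributors against ~9,195 full-time faculty. These are
# approximations, so this lands near (not exactly on) the reported ~35.5%.
contributors = 3300
full_time_faculty = 9195

share = contributors / full_time_faculty
print(f"{share:.1%}")  # 35.9%
```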

Below are the schools where at least 50% of the faculty contributed, with, of course, all of the caveats I’ve listed. (Columns: D, faculty contributing to Democrats; R, to Republicans; Both, to candidates of both parties; Pct, the share of full-time faculty who contributed.)

School D R Both Pct
American 56 2   81.7%
Barry 19 1   71.4%
Irvine 37 1   67.9%
Widener Commonwealth 9 1   66.7%
Hastings 43     66.2%
NYLS 30     63.8%
Pace 24     61.5%
New Mexico 23     60.5%
Fordham 46 1 1 59.3%
Rutgers 59 2   56.5%
Wake Forest 27     56.3%
CUNY 31     55.4%
Loyola Los Angeles 36   1 55.2%
Indiana-Bloomington 27     55.1%
Catholic 9 3 1 54.2%
George Washington 49 3   54.2%
Cooley 21   1 53.7%
Atlanta's John Marshall 8     53.3%
Montana 8     53.3%
California Western 19     52.8%
Utah 20     52.6%
Chicago Kent 25     52.1%
SMU 21     50.0%
Illinois-Chicago 21     50.0%
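
One way to sanity-check the table: if Pct is the number of contributors (D + R + Both) divided by the school’s full-time faculty count—my reading of the columns, not something stated in the post—the faculty denominators can be backed out:

```python
# Back out each school's implied full-time faculty count from the table,
# assuming Pct = (D + R + Both) / full-time faculty. The column reading and
# the rounding to whole faculty counts are my inferences, not data from the post.
rows = [
    # (school, D, R, Both, reported Pct)
    ("American", 56, 2, 0, 81.7),
    ("Fordham", 46, 1, 1, 59.3),
]

implied = {}
for school, d, r, both, pct in rows:
    contributors = d + r + both
    implied[school] = (contributors, round(contributors / (pct / 100)))

print(implied)  # American: 58 of ~71 faculty; Fordham: 48 of ~81
```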

One might be hard pressed to see any rhyme or reason to which schools make the list. It’s possible, for instance, that some DC schools (like American, Catholic, and George Washington) attract disproportionately more donors hoping to serve in a future administration. It’s possible that famous faculty members (Katie Porter at Irvine) or famous alumni (Kamala Harris at Hastings, now UC Law SF) prompted more donations. Only three of the Princeton Review’s “most liberal law students” schools (American, Irvine, George Washington) appear on the list. At the other end, here are the schools with the lowest shares of contributing faculty:

School D R Both Pct
Regent   1   4.2%
Lincoln Memorial 1     4.5%
Faulkner   1   5.9%
North Dakota 1     6.3%
Idaho 3     7.9%
South Dakota 2     10.0%
LSU 4 1   12.8%
Southern 6 1   13.0%
Ave Maria 2 1   13.0%
Widener Delaware 4     13.3%
Indiana-Indianapolis 6     13.6%
Gonzaga 4     14.8%
Liberty 1 2   15.0%
Loyola New Orleans 6     15.0%
Baylor 5     16.7%
St. Thomas (Florida) 7     16.7%
Capital 4     17.4%
Mississippi College 4     17.4%
Villanova 8     17.8%
Tulane 9 1   18.2%
Wyoming 4     18.2%
Elon 7     18.4%
Texas A&M 11     18.6%
Washington University 18     18.8%
St. Thomas (Minnesota) 5     19.2%
Cleveland State 6     19.4%

Again, it’s interesting to return to the Princeton Review rankings of the “most conservative students.” Six of those top ten (Ave Maria, Regent, Faulkner, LSU, Idaho, Mississippi College) make the list of the least politically engaged faculty.

So there are varying things to consider. On the whole, roughly 1/3 of reported faculty (perhaps a bit more, but probably not much) contributed over the last five-plus years. At a handful of schools (perhaps for identifiable reasons), contribution rates are much higher or much lower. It could be that political engagement is happening elsewhere. That said, with 1/3 of faculty contributing and nearly 96% of contributions going to Democrats, I am not sure the data is masking substantial numbers of Republican contributors simply sitting on the sidelines—though perhaps a few more, given how political engagement shakes out among the least-active institutions.

There are some other contribution figures to consider, perhaps for later posts.

Analysis of first-time bar passage data for Class of 2023 and ultimate bar passage data for Class of 2021

The ABA has released its new batch of data on bar passage. The data includes first-time passage data for the Class of 2023 and “ultimate” passage data for the Class of 2021. As I noted earlier, USNWR has increased the weight on bar passage (18% of the methodology for first-time passage, 7% for ultimate passage), making it one of the biggest metrics. It is also one of the most volatile.

To offer a snapshot of what the data means, I looked at both the first-time and ultimate passage data, compared schools’ performance against their Class of 2022 and Class of 2020 metrics, and weighted the data the way USNWR does for a point of comparison.

Note that USNWR has not yet released its latest rankings, due in spring 2024; those will include the Class of 2022 and 2020 metrics. This new batch of data will appear in the rankings released in 2025.

Here are the schools projected to improve in this metric (which, again, is 25% of the rankings under the current methodology) relative to the Classes of 2022 and 2020. The numbers below show the change in score—how much a school is projected to improve or decline in the scoring. They are not the bar passage data itself, which is a comparative metric that is harder to interpret in raw terms. That said, these numbers have limited meaning in isolation, as bar passage is just one factor among several.
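
To make these “change in score” figures concrete, here is a minimal sketch of the kind of computation involved, assuming each school’s pass rates are standardized against the field and then weighted per the weights above (18% first-time, 7% ultimate). The school names, rates, and the z-score standardization itself are illustrative assumptions on my part; USNWR’s exact scaling is not public.

```python
import statistics

# Hypothetical first-time and ultimate pass rates for a three-school field.
# All numbers are invented for illustration.
first_time = {"A": 0.88, "B": 0.75, "C": 0.69}
ultimate = {"A": 0.96, "B": 0.90, "C": 0.84}

def z_scores(rates):
    """Standardize each school's rate against the field (mean 0, sd 1)."""
    mean = statistics.mean(rates.values())
    sd = statistics.stdev(rates.values())
    return {school: (r - mean) / sd for school, r in rates.items()}

# Weight the standardized scores per the stated weights:
# 18% first-time passage, 7% ultimate passage.
ft_z, ult_z = z_scores(first_time), z_scores(ultimate)
weighted = {s: 0.18 * ft_z[s] + 0.07 * ult_z[s] for s in first_time}
```

A school’s year-over-year difference in this weighted value is the sort of change-in-score number listed below.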

Pontifical Catholic University of P.R. 0.316441

Appalachian School of Law 0.2564

Texas Southern University Thurgood Marshall School of Law 0.230382

Widener University-Delaware 0.225445

Northern Kentucky University 0.201138

Stetson University College of Law 0.188107

Villanova University 0.179214

Miami, University of 0.158228

Kansas, University of 0.152897

Albany Law School 0.141105

Baltimore, University of 0.129179

Texas Tech University 0.127884

Southern Illinois University 0.127633

Cincinnati, University of 0.127493

Saint Louis University 0.122641

North Carolina Central University 0.118442

Pittsburgh, University of 0.116755

Memphis, University of 0.106309

Vanderbilt University 0.104692

Boston College 0.102051

Here are the schools projected to decline in this metric over the Classes of 2022 and 2020.

Willamette University -0.41276

New Hampshire, University of -0.3903

Illinois, University of -0.32793

Case Western Reserve University -0.32763

Florida A&M University -0.31228

Ohio Northern University -0.30423

City University of New York -0.25318

Kentucky, University of -0.20322

Southern University -0.18699

Missouri, University of -0.17881

Puerto Rico, University of -0.166

Seattle University -0.16593

Pennsylvania State-Dickinson Law -0.15274

Regent University Law School -0.14854

Tulsa, University of -0.14119

Colorado, University of -0.1361

Gonzaga University -0.1291

Cleveland State University College of Law -0.12842

California Western School of Law -0.12533

St. Thomas University (Florida) -0.12298