How did Big Law survive the "Death of Big Law"?

The second in an occasional series I call “dire predictions.”

In 2010, Professor Larry Ribstein published a piece called The Death of Big Law in the Wisconsin Law Review. Here are a few of the more dire claims Professor Ribstein made:

  • “Big Law’s problems are long-term, and may have been masked until recently by a strong economy, particularly in finance and real estate. The real problem with Big Law is the non-viability of its particular model of delivering legal services.”

  • “When big firms try to expand without the support structure they are prone to failure. Big Law recently has been subject to many market pressures that have exposed its structural weakness. The result, not surprisingly, is that large law firms are shrinking or dying and smaller firms that do not attempt to mimic the form of Big Law are rising in their place.”

  • “These Big Law efforts to stay big are not, however, sustainable. Hiring more associates makes it harder for firms to provide the training and mentoring necessary to back their reputational bond.”

  • “In a nutshell, these firms need outside capital to survive, but lack a business model for the development of firm-specific property that would enable the firms to attract this capital. These basic problems have left Big Law vulnerable to client demands for cheaper and more sophisticated legal products, competition among various providers of legal services, and national and international regulatory competition. The result is likely to be the end of the major role large law firms have played in the delivery of legal services.”

  • “The death of Big Law has significant implications for legal education, the creation of law and the role of lawyers. First, a major shift in market demand for law graduates ultimately will affect the demand for and price of legal education. Big Law’s inverted pyramid, by which law firms can bill out even entry-level associate time at high hourly rates, has created a high demand and escalating pay for top law students. The pressures on Big Law discussed throughout this Article are ending this era with layoffs, deferrals, pay reductions, and merit-based pay.”

The late Professor Ribstein’s piece is one of many that arose in reaction to the 2009-2010 financial crisis. But large law firms appear to be thriving and continue to hire new law school graduates as associates at an ever-increasing clip. Two charts to consider.

First, the number of law firms with gross annual revenue exceeding $1 billion has climbed swiftly over the last decade or so. There were just 13 such firms in 2011, but 52 in 2021 (down to 50 in 2022). True, inflation accounts for some of the rising revenue. But the trend also reflects large law firms staying large—or becoming larger. (Figures from law.com AmLaw annual reports.)

Second, law student placement in those large-firm jobs. For the Class of 2011, nearly 4700 graduates ended up in those positions, just over 10% of the graduating class. Since then, graduating classes have shrunk by several thousand students, which has helped the overall placement rate as a percentage of graduates. But raw placement has nearly doubled in the last decade, too, to over 8500 for the Class of 2022, or nearly 25% of the graduating class.

Of course, one could find ways that “Big Law” is changing, whether that’s through the use of technology, the relationships it has with clients, its profits and salary structure, whatever it may be.

But “Big Law,” despite the dire predictions in the midst of the financial crisis, does not appear anywhere close to dead. To the extent there are large firms aggregating attorneys, with partners sharing significant profits among themselves and hiring a steady stream of associates for large and sophisticated work of large corporate clients, the model does not appear dead, but growing. Perhaps other types of disruption will appear in the future to change this model. But the financial stability of the model appears largely intact.

Overall legal employment for the Class of 2022 improves slightly, with large law firm and public interest placement growing

Neither the aftermath of a pandemic, nor bar exam challenges, nor a softening economy dampened employment outcomes for law school graduates in 2022. Outcomes improved a touch. Below are figures from the ABA-disclosed data (excluding Puerto Rico’s three law schools). These are ten-month figures from March 15, 2023 for the Class of 2022.

  Class Graduates FTLT BPR Placement FTLT JDA
  (FTLT BPR = full-time, long-term, bar passage-required jobs; FTLT JDA = full-time, long-term, J.D. advantage jobs)
Class of 2012 45,751 25,503 55.7% 4,218
Class of 2013 46,112 25,787 55.9% 4,550
Class of 2014 43,195 25,348 58.7% 4,774
Class of 2015 40,205 23,895 59.4% 4,416
Class of 2016 36,654 22,874 62.4% 3,948
Class of 2017 34,428 23,078 67.0% 3,121
Class of 2018 33,633 23,314 69.3% 3,123
Class of 2019 33,462 24,409 72.9% 2,799
Class of 2020 33,926 24,006 70.8% 2,514
Class of 2021 35,310 26,423 74.8% 3,056
Class of 2022 35,638 27,607 77.5% 2,734

Placement is very good. There was an increase of over 1,000 full-time, long-term, bar passage-required jobs year over year, even as the graduating class was the largest since 2016. The result was a placement rate of 77.5%. J.D. advantage jobs decreased somewhat, perhaps consistent with a hot law firm market last year.

It’s remarkable to compare the placement rates of the Class of 2012 and the Class of 2022: from 56% to 78%. The rise is largely attributable to the decline in class size.
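The table above lets us check that claim with a simple decomposition. The figures come from the table; the counterfactual (2022’s job count spread over 2012’s larger class) is my own construction:

```python
# Figures from the table above (Class of 2012 and Class of 2022)
grads_2012, jobs_2012 = 45_751, 25_503
grads_2022, jobs_2022 = 35_638, 27_607

rate_2012 = jobs_2012 / grads_2012        # 55.7%
rate_2022 = jobs_2022 / grads_2022        # 77.5%

# Counterfactual: if 2022's job count were spread over 2012's class size,
# the rate would rise only modestly -- most of the gain is class shrinkage.
counterfactual = jobs_2022 / grads_2012   # ~60.3%

print(f"{rate_2012:.1%} -> {rate_2022:.1%} (counterfactual: {counterfactual:.1%})")
```

On these numbers, raw job growth alone would have moved the rate from 55.7% to about 60.3%; the remaining gain comes from the smaller graduating class.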

Here’s some comparison of the year-over-year categories.

FTLT Class of 2021 Class of 2022 Net Delta
Solo 234 160 -74 -31.6%
2-10 5,205 5,070 -135 -2.6%
11-25 2,004 2,115 111 5.5%
26-50 1,218 1,360 142 11.7%
51-100 1,003 1,175 172 17.1%
101-250 1,143 1,246 103 9.0%
251-500 1,108 1,145 37 3.3%
501+ 5,740 6,137 397 6.9%
Business/Industry 3,070 2,797 -273 -8.9%
Government 3,492 3,591 99 2.8%
Public Interest 2,573 2,875 302 11.7%
Federal Clerk 1,189 1,130 -59 -5.0%
State Clerk 2,094 2,053 -41 -2.0%
Academia/Education 328 375 47 14.3%

The trend continues last year’s uptick in public interest placement, which was not an outlier. Public interest job placement is up over 100% since the Class of 2017, and these eye-popping numbers continue to rise. It is likely not an overstatement to say that law students are increasingly oriented toward public interest, and that there are ample funding opportunities in public interest work to sustain these graduates. (I include a visualization of the trend of raw placement into these jobs here.)

Sole practitioners continue to slide significantly (they were in the low 300s not long ago in raw placement).

Additionally, placement at extremely large law firms continues to boom, up by thousands of graduates over the last several years. Placement in firms with at least 101 attorneys is around 8,500. Nearly 25% of all law school graduates landed in a “Big Law” firm, and more than 30% of those employed in a full-time, long-term, bar passage-required job did so.
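These figures follow directly from the Class of 2022 column in the table above (firms of 101+ attorneys):

```python
# "Big Law" here = firms with at least 101 attorneys, Class of 2022 figures
big_law = 1_246 + 1_145 + 6_137   # 101-250, 251-500, 501+ categories
graduates = 35_638                 # total Class of 2022 graduates
ftlt_bpr = 27_607                  # FTLT bar passage-required jobs

print(big_law)                        # ~8,500 total Big Law placements
print(f"{big_law / graduates:.1%}")   # share of all graduates
print(f"{big_law / ftlt_bpr:.1%}")    # share of those in FTLT BPR jobs
```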

Federal clerkship placement has dropped a bit, perhaps because more judges are hiring those with work experience rather than recent graduates, or perhaps because the pool of potential candidates is shrinking as more judges hire students for multiple clerkships.

Some law schools fundamentally misunderstand the USNWR formula, in part because of USNWR's opaque methodology

Earlier this week, USNWR announced it was indefinitely postponing release of its law school rankings, after delaying their release one week. It isn’t the first data fiasco that’s hit USNWR in law rankings. In 2021, it had four independent problems, two involving disputed methodology and two involving disputed data, that forced retraction and recalculation.

There are likely obvious problems with the data USNWR collected. For instance, Paul Caron earlier noted discrepancies in the bar passage data released by the ABA. I noticed similar problems back in January, but (1) I remedied some of them and (2) left the rest as is, assuming that, for my purposes, close was good enough. (It was.) The ABA maintains a spreadsheet of data that it does not update, and individual PDFs for each law school that it does update—meaning any corrected discrepancies must later be manually added to the spreadsheet. It is a terrible system, exacerbated by the confusing columns the ABA uses to disclose data. But it affected only a small handful of schools, and it is possible USNWR has observed the issue and is correcting it.

A greater mistake, advanced by law school deans, relates to employment data. Administrators and deans at Yale, Harvard, and Berkeley, at the very least, have complained very publicly to Reuters and the New York Times that their employment figures are inaccurate.

They are incorrect. The complaints reflect a basic misunderstanding of the USNWR data, though the misunderstanding is admittedly exacerbated by how opaquely USNWR discloses its metrics.

In 2014, I highlighted how USNWR publicly shares certain data with prospective law students, but then conceals other data that it actually uses in reaching its overall ranking. This is a curious choice: it shares data it does not deem relevant to the rankings, while concealing other data that is relevant to the rankings.

The obvious one is LSAT score. USNWR will display the 25th-75th percentile range of LSAT scores. But it uses the 50th percentile in its ranking. That could be found elsewhere in its publicly-facing data if one looks carefully. And it is certainly available in the ABA disclosures.

Another less obvious one is bar passage data. USNWR will display the first-time pass rate of the school in the modal jurisdiction, and that jurisdiction’s overall pass rate. But it uses the ratio of the school’s first-time rate over the overall pass rate, a number it does not display (though simple arithmetic yields it). And in recent years, it uses the overall rate from all test-takers across all jurisdictions, which it also does not show. Again, this is certainly available in the ABA disclosures.
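In other words, the figure USNWR actually uses can be recovered from the two numbers it does display. A minimal sketch, with entirely hypothetical pass rates:

```python
# Hypothetical rates for illustration -- not any school's actual figures.
school_first_time_rate = 0.86     # displayed by USNWR
jurisdiction_overall_rate = 0.78  # displayed by USNWR

# The ratio USNWR reportedly uses in its formula but does not display:
ratio = school_first_time_rate / jurisdiction_overall_rate
print(round(ratio, 3))  # > 1.0 means the school outperforms the jurisdiction
```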

Now, on to employment data. As my 2014 post shows, USNWR displays an “employed” statistic, both at graduation and 9 or 10 months after graduation. But it has never used that statistic in its rankings formula (EDIT: in recent years—in the pre-recession days, it weighed employment outcomes differently). It has, instead, weighed various categories to create its own “employment rank.” That scaled score is used in the formula. And it has never disclosed how it weighs the categories.

Let’s go back to what USNWR publicly assured law schools earlier this month (before withdrawing this guidance):

The 2023-2024 Best Law Schools methodology includes:

. . .

Full credit for all full-time, long-term fellowships -- includes those that are school funded -- where bar passage is required or where the JD degree is an advantage

Maximum credit for those enrolled in graduate studies in the ABA employment outcomes grid

Note that the methodology will give “full credit” or “maximum credit” for these positions. That is, its rankings formula will give these positions, as promised to law schools based on their complaints, full weight in its methodology.

I had, and have, no expectation that this would change what it publicly shares with prospective law students about who is “employed.” Again, that’s a different category, not used in the rankings. I assume, for instance, USNWR believes its consumers do not consider enrollment in a graduate program as being “employed,” so it does not include them in this publicly-facing metric.

Now, how can law schools know that this publicly-facing metric is not the one used in the rankings methodology, despite what USNWR has said? A couple of ways.

First, as I pointed out back in January, “I assume before they made a decision to boycott, law schools modeled some potential results from the boycott to determine what effect it may have on the rankings.” So law schools can use their modeling, based on USNWR’s own public statements, to determine where they would fall. My modeling very closely matches the now-withdrawn rankings. Indeed, Yale was the single greatest beneficiary of the employment methodology change, as I pointed out back in January. It is very easy to run the modeling with school-funded and graduate positions given “full weight,” or given some discounted weight, and see the difference in results. It is impossible for Yale to be ranked #1 under the old formula—that is, in a world where its many graduates in school-funded or graduate positions did not receive “full weight” in the methodology. Again, very simple, publicly-available information (plus a little effort reverse-engineering the employment metrics from years past) demonstrates the outcomes.

Second, USNWR will privately share with schools subscribing to its service an “employment rank.” This raw “rank” figure is the output of the various weights it gives to employment metrics. It does not reveal how it gets there; but it does reveal where law schools stand.

It takes essentially no effort to see whether the relationship between the “employment” percentage and the “employment rank” diverges sharply or tracks closely. And that’s even accounting for the fact that the “rank” can include subtle weights for many different positions. At schools like Yale, there are very few variables. In 2021, it had graduates in just 10 categories. And given that a whopping 30 of them were in full-time, long-term, law school-funded, bar passage-required positions, and another 7 in graduate programs, either the mismatch between “employment” percentage and “employment rank” would be obvious, or the two would match pretty cleanly.

Third, one can also reverse engineer the “employment rank” to see how USNWR gives various weight to the various ABA categories. This takes some effort, but, again, it is entirely feasible to see how these jobs are given various weights to yield a “rank” that looks like what USNWR privately shares. And again, for schools that run these figures themselves, they can see if USNWR is actually giving full “weight” to certain positions or not.
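One way to sketch that reverse-engineering: treat the “employment rank” input as a weighted sum of each school’s shares across the ABA employment categories, and, with scores for enough schools, solve for the weights by least squares. Everything below is made up for illustration: the category list, the weights, and the data are my assumptions, not USNWR’s actual formula.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical employment categories and weights (assumed, not USNWR's)
categories = ["FTLT bar passage required", "FTLT J.D. advantage",
              "school-funded", "graduate studies", "unemployed"]
true_weights = np.array([1.0, 0.7, 1.0, 1.0, 0.0])

# Simulated: fraction of each graduating class in each category, 40 schools
shares = rng.dirichlet(np.ones(len(categories)), size=40)

# The score a ratings service might privately share with each school
scores = shares @ true_weights

# Recover the weights from observed shares and scores via least squares
est, *_ = np.linalg.lstsq(shares, scores, rcond=None)
print(np.round(est, 3))  # recovers the assumed weights
```

With noiseless synthetic data the recovery is exact; with real, rounded figures one would expect approximate estimates, but still enough to tell “full weight” from a discounted weight.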

USNWR’s opaque approach to “employment rank” certainly contributes to law schools misunderstanding the formula. But law schools—particularly elite ones who initiated the boycott and insisted they do not care about the rankings, only now to care very much about them—should spend more effort understanding the methodology before perpetuating these erroneous claims.