"Nonjudicial Solutions to Partisan Gerrymandering"

I have posted this draft of an Essay forthcoming in the Howard Law Journal, “Nonjudicial Solutions to Partisan Gerrymandering.” Here is the abstract:

This Essay offers some hesitation over judicial solutions to partisan gerrymandering, hesitation consistent with Justice Frankfurter’s dissenting opinion in Baker v. Carr. It argues that partisan gerrymandering reform is best suited for the political process and not the judiciary. First, it traces the longstanding roots of the problem and the longstanding trouble the federal judiciary has had engaging in the process, which cautions against judicial intervention. Second, it highlights the weaknesses in the constitutional legal theories that purport to offer readily available judicially manageable standards to handle partisan gerrymandering claims. Third, it identifies nonjudicial solutions at the state legislative level, solutions that offer more promise than any judicial solution and that offer the flexibility to change through subsequent legislation if these solutions prove worse than the original problem. Fourth, it notes weaknesses in judicial engagement in partisan gerrymandering, from opaque judicial decisionmaking to collusive consent decrees, that independently counsel against judicial involvement.

This Essay is a contribution to the Wiley A. Branton/Howard Law Journal Symposium, "We The People? Internal and External Challenges to the American Electoral Process."

A continuing trickle of law school closures

UPDATE: As of late 2019, Western State is under new ownership and successfully petitioned for ABA accreditation, which renders portions of this post inaccurate.

One year ago today—March 22, 2018—I reflected on the “trickle” of law school closures. Some campuses closed (Cooley’s Ann Arbor branch, Atlanta’s John Marshall’s Savannah branch), two schools merged into one (William Mitchell and Hamline), and others announced their closure (Indiana Tech, Whittier, Charlotte, and Valparaiso). In the last year, Arizona Summit and Western State have announced their closures.

Western State closing two years after Whittier is a remarkable turn for legal education in Orange County, California. Orange County, with more than 3 million residents, is one of the most populous and fastest-growing counties in the United States.

California has long had a number of state-accredited law schools, schools that do not have full ABA accreditation. Western State has been around since the 1970s but was not the first Orange County school to gain full ABA accreditation—that was Whittier in 1978. Western State joined newcomer Chapman as fully accredited in 1998. Then UC-Irvine was accredited in 2011. But now two of those four schools have closed.

While we are a long way from the recession, and while law school enrollment has stabilized (and slightly improved) over the last few years, there remain longstanding pressures on legal education, in part from the legacy of the recession—small class sizes can only be sustained so long, scholarships have increased to attract students, the transfer market has disproportionately impacted more marginal schools, lower credentials of incoming students have translated into systemic lower bar passage rates, and so on.

We may still see a few more closures in the years ahead—for-profit schools have borne the brunt of the closures so far, but we’ll see what happens in the months to come.

The new arms race for USNWR law specialty rankings

The USNWR law “specialty” rankings long operated this way: schools would identify one faculty member whose specialty matched one of the various USNWR specialty categories (legal writing, trial advocacy, tax, etc.). USNWR would send a survey to those faculty asking them to list up to 15 of the top schools in those areas. USNWR would then take the top half of those schools who received a critical mass of votes, and rank them based upon who received the most votes—just ordinal rank with no total votes listed. For many specialty areas, that meant 10 to 20 schools. And for the other 180 to 190 schools, that meant blissful ignorance.

USNWR changed that methodology this year in a few ways. First, its survey asks voters to rate every school in the specialty on a scale of 1 to 5, similar to how the peer reputation survey works. Second, it ranks all the schools that received a critical mass of votes (i.e., about 10 votes—and most law professors are not shy about rating most schools). Third, it now lists that reputation score, ties and all.

The result is that almost all schools are ranked in almost all categories. And now your school might be 33d or 107th or 56th or something in a category.

The result in some categories is comical compression. A score of 2.0 (out of 5) gets you 91st in International Law, and a score of 1.0 (the bottom) gets you to 177th. Ties are abundant—after all, there are usually at least 180 schools ranked, and given that the scale is from 5.0 to 1.0, and that virtually all schools are in the 4.0 to 1.0 range, there are going to be a lot of ties.
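To see why a 1-to-5 scale necessarily produces this compression, here is a minimal sketch using standard competition ranking (tied schools share a rank; the next distinct score skips ahead). The school names and scores are hypothetical illustrations, not actual USNWR survey data:

```python
# Hypothetical average reputation scores, rounded to one decimal
# as USNWR reports them -- not actual survey data.
scores = {"School A": 2.0, "School B": 2.0, "School C": 1.9,
          "School D": 1.9, "School E": 1.9, "School F": 1.8}

# Standard competition ranking: a school's rank is 1 plus the
# number of schools with a strictly higher score, so tied schools
# share a rank and the next score skips ahead.
ranks = {school: 1 + sum(1 for s in scores.values() if s > score)
         for school, score in scores.items()}

print(ranks)
# Schools A and B tie at 1, C/D/E tie at 3, and F lands at 6.
```

With only about 40 possible one-decimal scores between 1.0 and 5.0—and virtually all schools between 1.0 and 4.0—some 180 ranked schools are forced into large tied tranches, which is exactly the compression described above.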

Worse, now schools can advertise their top X program, when X in the past typically wouldn’t drop past 10 to 20. Now, top 30, top 50, top 100 all earn bragging rights.

So now there’s a new arms race. Schools know exactly where they sit in this year’s survey, how tantalizingly close the next tranche of the ratings is (because of the ties), how much higher that ranking is (again, because of the ties), and the temptation to pepper prospective voters with more marketing materials in the ever-escalating race to climb the ranks of a new set of specialty rankings. In the past, it was blissful ignorance for those below 20th. Today, it’s all laid bare.

Perhaps I’m wrong. Maybe schools will mostly ignore the change to the specialty rankings. The compression and ties alone should cause most to ignore them. But, I doubt it. The allure of rankings and the temptation of marketing departments to boast to prospective students and alumni about some figure (especially if that figure is higher than the overall USNWR rank) will, I think, overwhelm cooler heads.

Anatomy of a botched USNWR law ranking leak

For the past few years, USNWR has emailed all law school deans an embargoed PDF listing the tentative law school rankings about a week before their formal release. And for the past few years, within minutes (and in disregard of that embargo), that email has been leaked to a private consulting company, which then posts the rankings on its corporate blog, where the rankings then spread via social media and gossip sites.

This year, USNWR did something different. It released most of its graduate school rankings in an Excel spreadsheet on a password-protected site around 8 am ET on Tuesday, March 5. But it did not release the full-time law school rankings, nor the business school rankings. (I guess we know which schools are beholden to these rankings and where USNWR sees its value!)

Instead, shortly after, individuals at schools received their own school's ranking, and nothing more. This makes leaking much more challenging. If you leak your own school's ranking, it's obvious you leaked it, and USNWR may punish you by not giving you access to that embargoed data early next year. 

But around 5 pm ET on Tuesday, March 5, USNWR sent out a new update. Its Academic Insights database would now have the 2020 rankings data (that is, the rankings data to be released March 12, 2019). 

Academic Insights is a USNWR platform that law schools purchase a license to access and use. It has rankings data stretching back to the beginning. It offers multiple ways to view the data inside AI, or to pull the rankings data out of the database. 

It's fairly user friendly, but it isn't always the easiest to operate, and like many web databases it can suffer from some wonky behavior. That makes leaking a trickier proposition.

Around 7 pm ET March 5, the private consulting company posted the rankings. But the rankings made it very obvious that there were errors, and it also provided clues about how those errors came about.

To leak this information to someone, some law school administrator made a data request from the database and exported the rankings information to a CSV file. The “Leaderboard” AI database is a swift way to see the ranking of law schools compared to one another across categories. (Recall that the database stretches back through the history of USNWR, so it includes all schools that were ever ranked over the last 30 years, whether or not they’re ranked, or even exist, this year.)

The list then included as “N/A” (i.e., “unranked” this year) schools like Arizona Summit and the University of Puerto Rico. This is unsurprising because USNWR doesn’t rank (1) provisionally-accredited schools, (2) schools under probation, and (3) the schools in Puerto Rico.

But the leaked ranking included other bizarre “unranked” choices: Hamline University; Pennsylvania State University (Dickinson) pre-2017; Rutgers, The State University of New Jersey--Camden; Rutgers, The State University of New Jersey--Newark; Widener University; and William Mitchell College of Law (among others). These schools all no longer exist (along with a couple of others that have announced closures). Why list them as “unranked”?

Separately, the leaked rankings omitted information for Penn State - University Park, Penn State - Dickinson, Rutgers, Widener University (Commonwealth), Widener University Delaware, and Mitchell|Hamline. Why aren’t these schools in the database?

These are obviously not random database omissions. They're omissions of schools that split or merged. Their old schools are in the database. But the Leaderboard database pull request omitted those schools. (Why, I don't know.)

But there are ways of requesting school-specific data. You could request the specific institutional data in the AI database for, say, Penn State - University Park or Rutgers, and the data is now available for your review—including those institutions’ ranks. Of course, a few schools might ultimately be "rank not published," or "Tier 2" schools in the rankings. But they're not "unranked."

(Incidentally, from the revealed metadata, we know a lot of information about which person at which law school leaked the rankings, but that’s not what this blog post is about.)

The real botching came when the leaked ranking, with these strange inclusions and omissions (and some noticeable gaps—think two schools listed at 64, followed by a school ranked at 67, which means there’s an omission in the 64 ranking), was posted and began to spread. Panicked students and prospective students at places like Penn State and Rutgers asked what happened. The private consulting company replied that it “appeared” the schools were “unranked.” That spawned a great deal of speculation and worry on behalf of these students.

Of course, that wasn’t the case. The statements speculating that these schools appeared to be “unranked” were reckless—that is, they were made without an understanding of how the database operates and were based instead on speculation—and false—because, as I noted, each of these omitted schools had a ranking in the database, simply not in the CSV leaked to this private consulting company. (Later statements began to concede that these schools would probably be ranked, but those statements came only after worry and misinformation had spread.)

I pushed back against this false news last week in a couple of social media outlets, because it does no good to perpetuate false rumors about these law schools. These law schools, I insisted, would probably be ranked. They were ranked at that very moment in the AI database; and, barring a change, they’d be ranked when the rankings were released (i.e., now). (Of course, some schools, like those under probation or those in Puerto Rico, were never going to be ranked.)

The backlash I received on social media was impressive. I confess, I'm not sure why so many prospective law students felt threatened by my insistence that someone had disclosed bad information about schools like Penn State and Rutgers to them! (Happily, such comments roll off easily.) After that, apparently, USNWR asked for those rankings to be taken down, and they were. (Of course, they still floated around social media and gossip sites.)

But we now know that leaking USNWR information from the AI database presents complications for future leaks. Failure to understand how the database operates may leave an incomplete and inaccurate picture, as occurred this year with the botched leak. We’ll see what USNWR does for the 2021 rankings—are complete but accurate leaks better, or incomplete and inaccurate leaks? We shall see.

And for those relying on leaks in the future? Read skeptically. The leak included material errors this year, and I wouldn't be surprised to see material errors in future leaks.

JD enrollment for 2019 on pace for another modest increase but slight quality decline

For the last year, LSAC has offered really useful tools for tracking the current status of JD applicants and LSAT test-takers. Data is updated daily and visualized cleanly.

We’re at a stage where we’d expect just about 80% of the applicant pool to be filled out, so it should be useful to see where things stand.

Applicants are up 3.1% year-over-year. Applicant increases don’t perfectly equate to increases in matriculants—last year, we saw an 8% applicant spike but a more modest 2.6% matriculant increase. But up is better than down, and we may see another modest increase in JD enrollment in 2019. We’ve seen modest increases the last several years—better for law schools than flat enrollment, but a continued reflection of a “new normal” in legal education.

A more disappointing development is that the quality of applicants has declined. Applicants with a 165 or higher LSAT score are down a few percentage points. The bulk of the increase in applicants comes from those scoring 155-159. But then again, those with LSAT scores below 140 are also in decline.

It remains to be seen how many would-be high-LSAT performers opted for the GRE in lieu of the LSAT, which may affect how LSAT scores are distributed among applicants. But it’s another reason to think that any increase in JD enrollment in 2019 will be lower than the increase in the size of the applicant pool—at least, if law schools seek to preserve the quality of their classes.