Tuesday, September 4, 2007

What Leiter’s Study Doesn’t Show: Overall Scholarly Impact

Much attention will surely be paid to Brian Leiter’s new ranking of top law schools based on "scholarly impact," but deans and law faculties should keep the study’s limitations in mind as they gauge their reactions to it.

Leiter may well be right that "one will learn more about faculty quality at leading American law schools" from his study of citation counts in legal publications "than from U.S. News." And Leiter is careful to note several caveats. But there is more to say about what is missing.

Leiter’s study is not a measure of overall scholarly impact, but only of scholarly impact within a subset of the academy. The study is confined to the Westlaw JLR database, which includes only legal publications.

What does this miss? Leading scholars have an impact that ranges beyond their fields and beyond their nations. But the Westlaw database cannot capture impact outside the legal academy, and because all but a very few journals in the database are U.S.-based, it also misses the important global reach of many American legal scholars.

The impact of interdisciplinary scholars, in particular, will be under-counted. For serious interdisciplinary scholars, especially J.D./Ph.D.s, the true measure of scholarly success is to be seen as a leading figure both within the legal academy and within the Ph.D. field. To further their scholarship within the Ph.D. field, interdisciplinary scholars will publish in the field’s leading peer-reviewed journals; those in the humanities, and perhaps the social sciences, will also publish books.

This leads to two under-counting problems. First, the Westlaw JLR database will miss citations to a scholar’s work in journals other than law reviews. Second, legal scholars often confine their own research to the same Westlaw database, and so they never find, much less cite, the relevant books and articles published elsewhere.

Why should we care? If the focus is on ranking law schools, and under-counted faculty are evenly distributed across schools, then the rankings may still be fine. But some law schools (for example, the University of Southern California) have a higher percentage of faculty with Ph.D.s than many others, and it strikes me that those schools will be disadvantaged.

But individuals are also singled out, with the top ten at each school listed. Once something is seen as "countable," and both law schools and individual faculty are identified by how their numbers line up...you’ve seen the next step before. Law schools decide they need to move up the rankings. Perhaps they reward faculty according to their numbers. All of a sudden, that leading peer-reviewed journal outside the Westlaw JLR database is no longer a great prize on your C.V.; your efforts there were a waste of your law school’s resources.

Leiter has good intentions, but his rankings have a weakness that warrants more than a footnote. Every law school faculty should aspire to the broad impact, beyond the walls of the legal academy and beyond the nation’s borders, that his study does not measure.