H/T to Scott Fruehwald for calling attention to Kevin Bennardo & Alexa Chew (UNC), Citation Stickiness, 20 Journal of Appellate Practice & Process (forthcoming), in his Legal Skills Prof Blog post Are Lawyers Citing the Best Cases to Courts? He solicits comments on his post. One interesting question is whether we need to start teaching legal research differently in light of the results of Bennardo & Chew’s empirical study.

Here’s the abstract to Citation Stickiness:

This Article is an empirical study of what we call citation stickiness. A citation is sticky if it appears in one of the parties’ briefs and then again in the court’s opinion. Imagine that the parties use their briefs to toss citations in the court’s direction. Some of those citations stick and appear in the opinion — these are the sticky citations. Some of those citations don’t stick and are unmentioned by the court — these are the unsticky ones. Finally, some sources were never mentioned by the parties yet appear in the court’s opinion. These authorities are endogenous — they spring from the internal workings of the court itself.

In a perfect adversarial world, the percentage of sticky citations in courts’ opinions would be something approaching 100%. The parties would discuss the relevant authorities in their briefs, and the court would rely on the same authorities in its decision-making. Spoiler alert: our adversarial world is imperfect. Endogenous citations abound in judicial opinions and parties’ briefs are brimming with unsticky citations.

So we crunched the numbers. We analyzed 325 cases in the federal courts of appeals. Of the 7,552 cases cited in those opinions, more than half were never mentioned in the parties’ briefs. But there’s more — in the Article, you’ll learn how many of the 23,479 cases cited in the parties’ briefs were sticky and how many were unsticky. You’ll see the stickiness data sliced and diced in numerous ways: by circuit, by case topic, by an assortment of characteristics of the authoring judge. Read on!
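
To make the categories concrete, here is a minimal sketch (not the authors’ code; the case citations below are invented placeholders) showing how sticky, unsticky, and endogenous citations fall out of simple set operations on the briefs’ and the opinion’s citation lists:

```python
# Minimal sketch of the stickiness categories described in the abstract.
# The citation strings are hypothetical placeholders, not data from the study.

brief_citations = {
    "Smith v. Jones, 123 F.3d 456",
    "Doe v. Roe, 789 F.2d 101",
    "United States v. Adams, 555 U.S. 1",
}
opinion_citations = {
    "Smith v. Jones, 123 F.3d 456",   # also cited in the briefs -> sticky
    "Brown v. Board, 347 U.S. 483",   # appears only in the opinion -> endogenous
}

sticky = brief_citations & opinion_citations      # tossed by the parties and caught by the court
unsticky = brief_citations - opinion_citations    # tossed by the parties, never mentioned by the court
endogenous = opinion_citations - brief_citations  # sprang from the court itself

print(f"sticky: {len(sticky)}, unsticky: {len(unsticky)}, endogenous: {len(endogenous)}")
print(f"share of the opinion's citations that stuck: {len(sticky) / len(opinion_citations):.0%}")
```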

H/T to beSpacific for calling attention to Kristina Niedringhaus’ Is it a “Good” Case? Can You Rely on BCite, KeyCite, and Shepard’s to Tell You?, JOTWELL (April 22, 2019) (reviewing Paul Hellyer, Evaluating Shepard’s, KeyCite, and BCite for Case Validation Accuracy, 110 Law Libr. J. 449 (2018)). Here’s a snip:

Hellyer’s article is an important read for anyone who relies on a citator for case validation, or determining whether a case is still “good” law. The results are fascinating, and his methodology is thorough and detailed. Before delving into his findings, Hellyer reviews previous studies and explains his process in detail. His dataset is available upon request. The article has additional value because Hellyer shared his results with the three vendors prior to publication and describes and responds to some of their criticisms in his article, allowing the reader to make their own assessment of the critique.

From the abstract for Eric P. Voigt, Legal Research Demystified: A Step-by-Step Approach (Carolina Academic Press, Apr. 2019):

Legal Research Demystified offers a real-world approach to legal research for first-year law students. The book guides students through eight steps to research common law issues and ten steps to research statutory issues. It breaks down the research steps and process into “bite-size” pieces for novice researchers, minimizing the frustration often associated with learning new skills. This text also gives students context, explaining why and when a source or finding tool should be used when researching the law. The process of legal research, of course, is not linear. This book constantly reminds students of the recursive nature of legal research, and it identifies specific situations when they may deviate from the research steps.

Through the book’s step-by-step approach, students will connect seemingly unrelated tools (e.g., citators and the Key Number System) and understand how to leverage them to answer legal questions. Every chapter includes charts, diagrams, and screen captures to illustrate the research steps and finding methods. Each chapter concludes with a “summary of key points” section that reinforces important concepts from the chapter.

This book provides students and professors with multiple assessment tools. Each chapter ends with true-false and multiple-choice questions that test students’ understanding of chapter content. These questions are replicated on the book’s companion website, Core Knowledge. Students may answer these end-of-chapter questions, as well as more advanced questions, on Core Knowledge and receive immediate feedback, including an explanation of why the answer is correct or incorrect. Professors can generate reports to track students’ performance. Based on students’ performance, professors will know whether to review a topic in more detail or to move to the next topic. (New books contain an access code to Core Knowledge; students purchasing used books can buy an access code separately.)

Core Knowledge offers yet another assessment tool: interactive research exercises. These online exercises walk students through the research steps on Westlaw and Lexis Advance, giving professors the option to “flip” the classroom. Through many screen captures and tips, students can navigate both research platforms outside of class, allowing students and professors to dig deeper into the material during class. Each research exercise simulates a real-world research experience and contains self-grading questions. For example, in one exercise, students research on Westlaw to determine whether the client could recover damages against a neighbor for emotional distress over the death of the client’s dog. To answer the client’s question, students must complete the research steps, including finding and reviewing secondary sources on Westlaw, using the Key Number System and KeyCite, and performing keyword searches.

For upper-level law students, law clerks, and attorneys, Paul D. Callister’s Field Guide to Legal Research (West Academic Publishing, Mar. 11, 2019) “is not another exhaustive treatise but a concise, working person’s guide to solving complex legal research problems. Much like a field guide, this book classifies problem types and matches them with appropriate legal research resources. It emphasizes ‘working the problem,’ ‘problem typing,’ and then application of problem types to the appropriate resources. Problems and exercises illustrate the application of constructs and techniques to particular situations. Coverage is much broader than in first-year legal research classes. The book includes problems based on government agencies, statistics, and even patent law. There are numerous ‘screen shots’ and images to facilitate the learning process.” Recommended.

From the abstract for Vicenç Feliú, Moreau Lislet: The Man Behind the Digest of 1808:

The Louisiana legal system is unique in the United States, and legal scholars have been interested in learning how this situation came to pass. Most assume that the origin of this system is the Code Napoleon, and even legal scholars steeped in Louisiana law have a hard time answering the question of the roots of the Louisiana legal system. This book solves the riddle through painstaking research into the life of Louis Moreau Lislet, the driving force behind the Digest of 1808.

From the blurb for Kendall Svengalis, A Layperson’s Guide to Legal Research and Self-Help Law Books (New England Press 2018):

This unique and revolutionary new reference book provides reviews of nearly 800 significant self-help law books in 85 subject areas, each of which is preceded by a concise and illuminating overview of the subject area, with links to online sources for further information. The appendices include the most complete directory of public law libraries in the United States. This is an essential reference work for any law, public, or academic library that fields legal questions or inquiries.

Highly recommended.

From the abstract for Marin Dell, Fake News, Alternative Facts, and Disinformation: The Importance of Teaching Media Literacy to Law Students (Dec. 2018):

Like legal education, media literacy education teaches critical thinking skills. Students with media literacy education are able to evaluate media messages and decide for themselves the truth of media. Media literacy education is critical at all levels, but it should be a required inclusion for every legal education program.

Annalee Hickman believes that the best way to start teaching students about the many facets of legal research is by using supplemental readings. In Engaging Legal Research Students Through Supplemental Readings from the Last Decade, 26 Perspectives: Teaching Legal Res. & Writing 67 (2018), she offers a short annotated bibliography of readings intended to engage students in their LRW classes. Useful.

From the introduction to Lara Freed and Joel Atlas’s A Structural Approach to Case Synthesis, Fact Application, and Persuasive Framing of the Law, 26 Perspectives: Teaching Legal Res. & Writing 50 (2018):

Among the thorniest of [lawyering skills] are synthesizing cases, applying facts, and persuasively framing the law. Professors struggle to teach these skills, and students consistently struggle to understand and implement them. To lighten the burden for both professors and students, we have approached these skills structurally and, in doing so, have identified the fundamental components of the skills and common pitfalls associated with understanding and implementing them. With this foundation, we have created teaching models and examples that provide professors with a systematic, refined method for helping students acquire these skills.

The Fall 2018 issue of Perspectives: Teaching Legal Research & Writing is out now and available online here. Articles in this issue include:

  • Finding Your Muse by Professor Abigail L. Perdue
  • A Structural Approach to Case Synthesis, Fact Application, and Persuasive Framing of the Law by Professors Lara Freed and Joel Atlas
  • How to Support International ELL Law Students When You Only Have a Few of Them by Professor Sue Liemer
  • Engaging Legal Research Students Through Supplemental Readings from the Last Decade by Professor Annalee Hickman
  • Alliteration, Restraint, and a Mind at Work by Professor Patrick Barry
  • Show and Tell by Professor Patrick Barry
  • Reflections of a Legal Writing Advisor by Professor Paul Von Blum

From the abstract for Ryan Doerfler, Can a Statute Have More Than One Meaning? 94 New York University Law Review ___ (2019, Forthcoming):

What statutory language means can vary from statute to statute, or even provision to provision. But what about from case to case? The conventional wisdom is that the same language can mean different things as used in different places within the United States Code. As used in some specific place, however, that language means what it means. Put differently, the same statutory provision must mean the same thing in all cases. To hold otherwise, courts and scholars suggest, would be contrary both to the rules of grammar and to the rule of law.

This Article challenges that conventional wisdom. Building on the observation that speakers can and often do transparently communicate different things to different audiences with the same verbalization or written text, it argues that, as a purely linguistic matter, there is nothing to prevent Congress from doing the same with statutes. More still, because the practical advantages of using multiple meanings — in particular, linguistic economy — are at least as important to Congress as to ordinary speakers, this Article argues further that it would be just plain odd if Congress never chose to communicate multiple messages with the same statutory text.

As this Article goes on to show, recognizing the possibility of multiple statutory meanings would let courts reach sensible answers to important doctrinal questions they currently do their best to avoid. Most notably, thinking about multiple meanings in an informed way would help courts explain under what conditions more than one agency should receive deference when interpreting a multi-agency statute. Relatedly, it would let courts reject as false the choice between Chevron deference and the rule of lenity for statutes with both civil and criminal applications.

H/T beSpacific.

Paul Hellyer’s Evaluating Shepard’s, KeyCite, and BCite for Case Validation Accuracy, 110 Law Libr. J. 449 (2018) (paywalled), “evaluates and compares how accurately three legal citators (Shepard’s, KeyCite, and BCite) identify negative treatment of case law, based on a review of 357 citing relationships that at least one citator labeled as negative. In this sample, Shepard’s and KeyCite missed or mislabeled about one-third of negative citing relationships, while BCite missed or mislabeled over two-thirds. The citators’ relative performance is less clear when examining the most serious citator errors, examples of which can be found in all three citators.” Recommended.
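
The headline numbers are simple proportions over the 357 reviewed relationships. Here is a hypothetical sketch (the records below are invented for illustration and are not Hellyer’s data) of the kind of per-citator tally the study describes:

```python
# Hypothetical illustration of the tally Hellyer's method implies:
# for each citing relationship judged to be negative, record whether each
# citator flagged it correctly, mislabeled it, or missed it entirely.
# These three records are invented, not drawn from the article's dataset.

records = [
    {"Shepard's": "correct",    "KeyCite": "missed",  "BCite": "mislabeled"},
    {"Shepard's": "correct",    "KeyCite": "correct", "BCite": "missed"},
    {"Shepard's": "mislabeled", "KeyCite": "correct", "BCite": "correct"},
]

def error_rate(citator):
    """Share of negative citing relationships the citator missed or mislabeled."""
    errors = sum(1 for r in records if r[citator] != "correct")
    return errors / len(records)

for citator in ("Shepard's", "KeyCite", "BCite"):
    print(f"{citator}: {error_rate(citator):.0%} missed or mislabeled")
```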

H/T to Jason Wilson. His recent LinkedIn post calls attention to Kevin Bennardo, Abandoning Predictions, 16 Legal Communication & Rhetoric ___ (2019 Forthcoming). Here’s the abstract for this interesting essay:

Analytical documents are a hallmark of the law school legal writing curriculum and of the practice of law. In these documents, the author usually applies a body of law to a set of facts and reaches a conclusion. Oftentimes, that conclusion is phrased as a prediction (“The court is likely to find…”), and many academics even refer to analytical documents as “predictive” document types. If that describes you, this Essay’s goal is to convince you to change your ways.

Simply put, there is a difference between conducting a legal analysis and predicting the outcome of a legal dispute. If the author of an analytical document has only conducted a legal analysis, they have no business claiming that they can predict the outcome of the dispute. That distinction should be recognized in the teaching of analytical document genres and should be conveyed by legal professionals in their communication of legal analyses.

From the abstract for Zahr Said & Jessica M. Silbey, Narrative Topoi in the Digital Age, 68 Journal of Legal Education ___ (Forthcoming):

Decades of thoughtful law and humanities scholarship have made the case for using humanistic texts and methods in the legal classroom. We build on that scholarship by identifying and describing three “narrative topoi” of the twenty-first century: podcasts, Twitter, and fake news. We use the term “topos” (from the Greek meaning “place”) and its plural, “topoi,” to mean “a literary commonplace” and “general setting for discussion” in the context of literary forms. Like an identifiable genre, narrative topoi are familiar story paths for audiences to travel. These narrative topoi live in contemporary popular culture and are products of digital technology’s capacity to share and shape communication in new ways that draw on older narrative conventions and forms. In a law school, drawing on new narrative topoi can reorient legal analysis through inquiry into twenty-first-century problems of language, narrative form, authenticity, and audiences. Legal educators may also highlight historical continuity between cultural and legal history and today’s forms and experiences, foregrounding issues central to legal skills, such as analogic reasoning, advocacy, counseling, and factual analysis. We address all of these points while exploring particular examples of these narrative topoi of our digital age.

Kevin P. Tobia has posted Testing Original Public Meaning (Nov. 6, 2018). Here’s the abstract:

Various interpretive theories recommend using dictionaries or corpus linguistics to provide evidence about the “original public meaning” of legal texts. Such an interpretive inquiry is typically understood as an empirical one, aiming to discover a fact about public meaning: How did people actually understand the text at the time it became law? When dictionaries or corpora are used for this project, they are empirical tools, which might be reliable or unreliable instruments. However, the central question about these tools’ reliability remains unanswered: Do dictionaries and corpus linguistics reliably reflect original public meaning?

This paper develops a novel method to assess this question. It begins by examining the public meaning of modern terms. It compares people’s judgments about meaning to the verdicts that modern dictionaries and corpus linguistics deliver about (modern) public meaning. Eight experimental studies (total N = 1,327) reveal systematic divergences among the verdicts delivered by ordinary concept use, dictionary use, and corpus linguistics use. For example, the way in which people today apply the concept of a vehicle is systematically different from the way in which people apply the modern dictionary definition of a “vehicle” or the modern corpus linguistics data concerning vehicles. Strikingly similar results arise across levels of legal expertise; participants included 999 ordinary people, 230 “elite-university” law students (e.g. at Harvard and Yale), and 98 United States judges. These findings provide evidence about the reliability of dictionaries and corpus linguistics in estimating modern public meaning. I argue that these studies also provide evidence about these tools’ reliability in estimating original public meaning, in historical times.

The paper develops both the positive and critical implications of these experimental findings. Positively, the results reveal systematic patterns of the use of dictionaries and corpora. Corpus linguistics tends to generate prototypical uses, while dictionaries tend to generate more extensive uses. This discovery grounds normative principles for improving the use of both tools in legal interpretation. Critically, the results identify five argumentative fallacies that arise in legal-interpretive arguments that rely on corpus linguistics or dictionaries. More broadly, the results suggest that two central methods of determining original public meaning are surprisingly unreliable. This shifts the argumentative burden to public meaning originalism and other theories that rely upon these tools; those theories must provide a non-arbitrary account of these tools’ use and a demonstration that such methods are, in fact, reliable.
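
One way to picture the divergence the abstract describes: for a set of candidate items (is a bicycle a “vehicle”?), compare the verdicts of ordinary concept use, a dictionary definition, and corpus data, then count how often they agree. The toy verdicts below are invented for illustration and are not Tobia’s data; they merely echo his reported pattern that corpus results skew prototypical while dictionary definitions skew extensive.

```python
# Invented toy verdicts on whether each item counts as a "vehicle",
# illustrating the kind of divergence the studies measure; not Tobia's data.
items = ["car", "bicycle", "airplane", "canoe"]
verdicts = {
    "ordinary use": {"car": True, "bicycle": True,  "airplane": False, "canoe": False},
    "dictionary":   {"car": True, "bicycle": True,  "airplane": True,  "canoe": True},
    "corpus":       {"car": True, "bicycle": False, "airplane": False, "canoe": False},
}

def agreement(a, b):
    """Share of items on which two verdict sources agree."""
    same = sum(verdicts[a][item] == verdicts[b][item] for item in items)
    return same / len(items)

for source in ("dictionary", "corpus"):
    print(f"ordinary use vs {source}: {agreement('ordinary use', source):.0%} agreement")
```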

Michael A. Livermore et al. have posted Law Search as Prediction (Nov. 9, 2018). Here’s the abstract:

The process of searching for relevant legal materials is fundamental to legal reasoning. However, despite its enormous practical and theoretical importance, law search has been given inadequate attention by scholars. In this article, we define the problem of law search, examine its normative and empirical dimensions, and investigate one particularly promising computationally based approach. We implement a model of law search based on a notion of search space and search strategies and apply that model to the corpus of U.S. Supreme Court opinions. We test the success of the model against both citation information and hand-coded legal relevance determinations.

Interesting.
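
The abstract doesn’t spell out the model, but the general evaluation setup (rank candidate opinions for a query, then check the ranking against citation information) can be pictured with a toy sketch. The opinions, query, scoring function, and citation link below are all invented for illustration and are far simpler than the authors’ search-space model:

```python
# Toy illustration of testing a law-search ranking against citation data.
# The opinions, query, and citation link are invented; the article's model
# of search space and search strategies is far richer than this.

opinions = {
    "Op. A": "search incident to arrest of a vehicle passenger",
    "Op. B": "qualified immunity for officers executing a warrant",
    "Op. C": "warrantless search of a vehicle under the automobile exception",
}
cited_by_querying_opinion = {"Op. C"}   # ground truth drawn from citation information
query = "warrantless vehicle search"

def overlap_score(query, text):
    """Crude relevance score: number of shared lowercase words."""
    return len(set(query.lower().split()) & set(text.lower().split()))

ranked = sorted(opinions, key=lambda op: overlap_score(query, opinions[op]), reverse=True)
print("ranking:", ranked)
print("top result is actually cited:", ranked[0] in cited_by_querying_opinion)
```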

From the abstract for Nancy J. White, Legal Analysis: There’s a Template for That!, 2 ALSB Journal of Business Law & Ethics Pedagogy ___ (Forthcoming):

Legal analysis is often one of the more difficult skills to teach undergraduate and first-year law students. This skill, related to what is called “legal frame working” in law school, is needed by law students and lawyers to analyze legal issues and requires the user of the legal framework to have a detailed understanding of the law. This same level of skill is not taught at the undergraduate level, but a basic understanding increases critical thinking and helps undergraduate students conceptualize how the law is used. This paper describes a simplified method for introducing the skill of legal analysis to undergraduate students and first-year law students using a legal analysis template developed by the author.

H/T beSpacific.

Zillman’s A Quick Guide to Searching the Web, LLRX (Sept. 9, 2018), provides an explanation of four Internet search techniques: 1) Search Engines, 2) Indexes and Directories, 3) Intuitive Search, and 4) Custom Search and Deep Web Search. “The intent of this guide,” writes Marcus Zillman, “is to broaden your search horizons so that searching the web will intuitively become easier, more focused and more effective.” Recommended.