Category Archives: Legal Research

Want to search an out-of-date version of the National Survey of State Laws? There’s a very expensive online legal search service for that!

Westlaw carries the full text of the sixth edition of the National Survey of State Laws online. Therein lies the problem. In addition to not stating online that the sixth edition has been superseded by the much more recent seventh edition (which Westlaw is going to publish online), the compilers of the State Laws Survey have released two updates and four (or is it five?) new chapters that are not online. Bottom line: if you are using the National Survey of State Laws on Westlaw, you are searching eight-year-old topical state law surveys. Make a note of that, researchers, at least until the seventh edition is online.

Time for the folks in the Land of 10,000 Invoices to get the seventh edition of this valuable resource uploaded and to keep it updated once it is. Perhaps Lexis or BNA can do a better publishing job for this title. — Joe

PS: A reader has commented that the seventh edition of the National Survey of State Laws is available, apparently since Jan. 12, 2016, on HeinOnline.

Has West Search sacrificed comprehensiveness?

As a quick follow-up to my earlier post titled 10,000 documents: Is there a flaw in West Search? (March 20, 2017), it appears that a West reference attorney has confirmed my conclusion that Westlaw does not offer as comprehensive a searching capability as Lexis.

In Mary Whisner’s (reference librarian, Gallagher Law Library, University of Washington School of Law) research paper Comparing Case Law Retrievals, Mid-1990s and Today, Appendix B records an exchange between Whisner and a West reference attorney. Here are the pertinent parts:

11/18/2016 01:14:35PM Agent (Anna Wiles): “Those searches seem to be maxing out.”

11/18/2016 01:14:51PM Agent (Anna Wiles): “Sometimes, the algorithm will cut off before 10,000.”

11/18/2016 01:23:26PM Agent (Anna Wiles): “If you run the search in all states and all federal, it will max out because it is a broad search.”

11/18/2016 01:23:53PM Agent (Anna Wiles): “If you narrow by a jurisdiction, the results will not max out.”

But Whisner was attempting to perform a fairly comprehensive search. Note that, according to the West staffer, West Search will sometimes max out at under 10,000 documents too.

More evidence that in an attempt to find the Holy Grail of legal research — the uber-precise search result — West Search may have sacrificed comprehensiveness. — Joe

10,000 documents: Is there a flaw in West Search?

10,000 documents is an awful lot. Truly a low-precision, high-recall search. But sometimes one starts off searching very broadly because Westlaw and Lexis Advance provide a “search within results” option to narrow down the initial search output. While I do not perform many broad searches in Westlaw, I have never once seen a figure higher than 10,000 documents in my search results. I have, however, seen “10,000+” documents in equally broad Lexis Advance searches on the same topic. Unfortunately, 10,000 documents appears to be a search results limit in Westlaw.

If an initial search pulls up exactly 10,000 documents in Westlaw, there is no reason to believe the documents identified by one’s search are really all the potentially relevant documents in the Westlaw database. Searching within that initial 10,000-document result set would therefore be based on a seriously flawed subset of the Westlaw database, one defined by West Search rather than by one’s search logic. This is not the case in Lexis Advance, where a broad search may yield 10,000+ documents for searching within initial results. If this is indeed a flaw in West Search’s output, one must conclude that Lexis Advance offers more comprehensive searching of its database than Westlaw. — Joe
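PS: To make the concern concrete, here is a minimal, hypothetical sketch of how a hard cap on an initial result set can silently drop documents from a “search within results” pass. The cap, corpus, and scoring are invented for illustration; this is not how West Search or Lexis Advance actually rank or truncate results.

```python
# Hypothetical illustration: why narrowing within a capped result set can miss documents.
# The cap, documents, and scores below are invented; this is not Westlaw's or Lexis's
# actual ranking or truncation behavior.

RESULT_CAP = 10  # stand-in for the apparent 10,000-document ceiling

# A toy "database": every document matches "negligence"; every third also matches "damages".
corpus = [
    {"id": i,
     "terms": {"negligence"} | ({"damages"} if i % 3 == 0 else set()),
     "score": 1.0 / (i + 1)}          # invented relevance score
    for i in range(50)
]

def broad_search(database, term, cap=RESULT_CAP):
    """Return at most `cap` matching documents, ranked by relevance score."""
    hits = [d for d in database if term in d["terms"]]
    hits.sort(key=lambda d: d["score"], reverse=True)
    return hits[:cap]

def narrow(results, term):
    """'Search within results': filter an existing result set by another term."""
    return [d for d in results if term in d["terms"]]

capped_then_narrowed = narrow(broad_search(corpus, "negligence"), "damages")
narrowed_full_corpus = [d for d in corpus if {"negligence", "damages"} <= d["terms"]]

missed = len(narrowed_full_corpus) - len(capped_then_narrowed)
print(f"Narrowing within the capped set finds {len(capped_then_narrowed)} documents; "
      f"{missed} relevant documents never made the initial cut.")
```

In this toy setup the second-pass search finds only 4 of the 17 documents that actually match both terms; everything else was discarded by the cap before the researcher ever saw it, which is exactly the “flawed subset” worry described above.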

Gorsuch confirmation hearings gear up: “Find as much information about the new Supreme Court nominee as possible.”

“The idea for the Gorsuch Project was born after law librarians from several universities and government offices faced a similar question from their patrons: ‘Find as much information about the new Supreme Court nominee as possible.'” — From the Gorsuch Project.

The Gorsuch Project “is the result of the collaborative efforts of several libraries to research and collect a comprehensive set of materials relating to the Hon. Neil Gorsuch’s career on the 10th Circuit Court of Appeals. Majority opinions, dissents, and concurrences authored or joined by Gorsuch and references to his published work and speeches are presented here.” The academic law libraries involved are located at the Univ. of Illinois College of Law, the Univ. of Richmond School of Law, and the Univ. of Virginia School of Law (host site of the Project); the Free Law Project and the US Railroad Retirement Board also contributed to the project.

See also Neil M. Gorsuch, a Law Library of Congress bibliography last updated February 2, 2017.

H/T to Michel-Adrien Sheppard’s Slaw post. — Joe

Help! New HeinOnline Knowledge Base launched

Here. See also Bonnie Hein’s blog post. — Joe

Is Wikipedia a reliable legal authority now?

On Associate’s Mind, Keith Lee identifies recent court opinions that cite (or reject) Wikipedia as an authority. He writes:

Every Circuit has judicial opinions that cite Wikipedia as a reliable source for general knowledge. Who Ludacris is. Explaining Confidence Intervals. But some courts within the same Circuit will be dismissive of Wikipedia as a source of general information. There is no definitive answer. Judges seem to make determinations about Wikipedia’s reliability on a case-by-case basis. If you want to cite Wikipedia in a brief and not have a judge be dismissive of it, it’s probably worth your time running a quick search to see where the judge stands on the topic.

Hat tip to PinHawk’s Librarian News Digest on the PinHawk Blog. — Joe

Artificial intelligence in legal research

ILTA’s Beyond the Hype: Artificial Intelligence in Legal Research webinar was conducted last month and features ROSS Intelligence CEO and co-founder Andrew Arruda. The link takes you to the archived webinar. Interesting. — Joe

The big three legal research providers in the small law market

Lexis, Westlaw, and Fastcase are in a virtual tie in the small law market, according to a recent survey conducted by the law practice management software company Clio. The results of the survey revealed the following small law market shares:

  1. Westlaw, 20.58 percent
  2. Fastcase, 20.35 percent
  3. LexisNexis, 20.21 percent

See the below pie chart and table for details.

Hat tip to Bob Ambrogi’s LawSites post. — Joe

[Pie chart and table: Clio survey results on small law market share by provider]

Tracking Trump, there’s apps for that

In The Best Apps To Track Trump’s Legal Changes, Bob Ambrogi identifies three apps designed to monitor the Trump administration’s actions.

  1. The goal of Track Trump is “to isolate actual policy changes from rhetoric and political theater and to hold the administration accountable for the promises it made.”
  2. The Cabinet Center for Administrative Transition (CCAT) from the law firm Cadwalader, Wickersham & Taft collects “pronouncements, position papers, policy statements, and requirements as to legislative and regulatory change related to the financial service agenda of the President, the new administration and the new Congress. It tracks legislative developments, executive orders, policy positions, regulations, the regulators themselves, and relevant Trump administration news.”
  3. Columbia Law School’s Trump Human Rights Tracker follows the Trump administration’s actions and their implications for human rights.

— Joe

How likely is algorithmic transparency?

Here’s the abstract for Opening the Black Box: In Search of Algorithmic Transparency by Rachel Pollack Ichou (University of Oxford, Oxford Internet Institute):

Given the importance of search engines for public access to knowledge and questions over their neutrality, there have been many theoretical debates about the regulation of the search market and the transparency of search algorithms. However, there is little research on how such debates have played out empirically in the policy sphere. This paper aims to map how key actors in Europe and North America have positioned themselves in regard to transparency of search engine algorithms and the underlying political and economic ideas and interests that explain these positions. It also discusses the strategies actors have used to advocate for their positions and the likely impact of their efforts for or against greater transparency on the regulation of search engines. Using a range of qualitative research methods, including analysis of textual material and elite interviews with a wide range of stakeholders, this paper concludes that while discussions around algorithmic transparency will likely appear in future policy proposals, it is highly unlikely that search engines will ever be legally required to share their algorithms due to a confluence of interests shared by Google and its competitors. It ends with recommendations for how algorithmic transparency could be enhanced through qualified transparency, consumer choice, and education.

— Joe

Use Trump’s executive orders sourced on White House website with caution

In White House posts wrong versions of Trump’s orders on its website, USA Today reports that the texts of at least five Trump executive orders hosted on the White House website do not match the official text sent to the Federal Register. Quoting from the USA Today article, examples include:

► The controversial travel ban executive order suspended the Visa Interview Waiver Program and required the secretary of State to enforce a section of the Immigration and Naturalization Act requiring an in-person interview for everyone seeking a non-immigrant visa. But the White House version of the order referred to that provision as 8 U.S.C. 1222, which requires a physical and mental examination — not 8 U.S.C. 1202, which requires an interview.

► An executive order on ethical standards for administration appointees, as it appears on the White House website, refers to “section 207 of title 28” of the U.S. Code. As the nonprofit news site Pro Publica reported last week, that section does not exist. The Federal Register correctly cited section 207 of title 18, which does exist.

— Joe

The Algorithm as a Human Artifact: Implications for Legal {Re}Search

Here’s the abstract for Susan Nevelow Mart’s very interesting article The Algorithm as a Human Artifact: Implications for Legal {Re}Search (SSRN):

Abstract: When legal researchers search in online databases for the information they need to solve a legal problem, they need to remember that the algorithms that are returning results to them were designed by humans. The world of legal research is a human-constructed world, and the biases and assumptions the teams of humans that construct the online world bring to the task are imported into the systems we use for research. This article takes a look at what happens when six different teams of humans set out to solve the same problem: how to return results relevant to a searcher’s query in a case database. When comparing the top ten results for the same search entered into the same jurisdictional case database in Casetext, Fastcase, Google Scholar, Lexis Advance, Ravel, and Westlaw, the results are a remarkable testament to the variability of human problem solving. There is hardly any overlap in the cases that appear in the top ten results returned by each database. An average of forty percent of the cases were unique to one database, and only about 7% of the cases were returned in search results in all six databases. It is fair to say that each different set of engineers brought very different biases and assumptions to the creation of each search algorithm. One of the most surprising results was the clustering among the databases in terms of the percentage of relevant results. The oldest database providers, Westlaw and Lexis, had the highest percentages of relevant results, at 67% and 57%, respectively. The newer legal database providers, Fastcase, Google Scholar, Casetext, and Ravel, were also clustered together at a lower relevance rate, returning approximately 40% relevant results.

Legal research has always been an endeavor that required redundancy in searching; one resource does not usually provide a full answer, just as one search will not provide every necessary result. The study clearly demonstrates that the need for redundancy in searches and resources has not faded with the rise of the algorithm. From the law professor seeking to set up a corpus of cases to study, the trial lawyer seeking that one elusive case, the legal research professor showing students the limitations of algorithms, researchers who want full results will need to mine multiple resources with multiple searches. And more accountability about the nature of the algorithms being deployed would allow all researchers to craft searches that would be optimally successful.
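To see how the overlap figures in the abstract can be computed, here is a minimal sketch using invented top-ten result lists for six hypothetical databases. These are not Mart’s data, just an illustration of the arithmetic behind “unique to one database” and “returned by all six.”

```python
# Invented top-ten case lists for six databases -- not Mart's actual study data.
from collections import Counter

top_ten = {
    "Casetext":       {1, 2, 3, 4, 5, 6, 7, 8, 9, 10},
    "Fastcase":       {1, 2, 11, 12, 13, 14, 15, 16, 17, 18},
    "Google Scholar": {1, 3, 11, 19, 20, 21, 22, 23, 24, 25},
    "Lexis Advance":  {1, 2, 3, 19, 26, 27, 28, 29, 30, 31},
    "Ravel":          {1, 4, 11, 26, 32, 33, 34, 35, 36, 37},
    "Westlaw":        {1, 2, 4, 19, 32, 38, 39, 40, 41, 42},
}

# Count how many databases returned each case.
appearances = Counter(case for results in top_ten.values() for case in results)

unique_to_one = sum(1 for n in appearances.values() if n == 1)
in_all_six = sum(1 for n in appearances.values() if n == len(top_ten))
total = len(appearances)

print(f"{total} distinct cases across six top-ten lists")
print(f"{unique_to_one / total:.0%} of cases appear in only one database")
print(f"{in_all_six / total:.0%} of cases appear in all six databases")
```

Roughly this kind of tallying, averaged across Mart’s many test searches on the real result sets, is what underlies figures like her 40% unique and 7% all-six numbers.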

Recommended. — Joe

Seven major themes about the algorithm era

Pew Internet’s report, Code-Dependent: Pros and Cons of the Algorithm Age, identifies seven major algorithm era themes (listed below). See also NPR’s Will Algorithms Erode Our Decision-Making Skills? — Joe

[Pew Research Center infographic listing the report’s seven themes]

AALL announces new product award

Congratulations to Ravel Law for winning AALL’s New Product Award for its legal analytics service. Ravel FAQ here. — Joe

Docket-based research needed to find “submerged precedent”

“[S]ubmerged precedent pushes docketology in an uncharted direction by identifying a mass of reasoned opinions—putative precedent and not mere evidence of decision-making—that exist only on dockets,” writes Elizabeth McCuskey (Toledo) in Submerged Precedent, 16 Nevada Law Journal ___ (forthcoming 2016) [SSRN]. Professor McCuskey adds that “[s]ubmerged precedent thus raises the specter that docket-based research may be necessary in some areas to ascertain an accurate picture of the law itself, not just trial courts’ administration of it.” Here’s the abstract for this very interesting article:

This article scrutinizes the intensely individual, yet powerfully public nature of precedent, inquiring which decisions are made available for posterity in the body of precedent and which remain solely with their authors and the instant parties. At its broadest level, this article investigates the intricate relationships among precedent, access, and technology in the federal district courts, examining how technology can operationalize precedent doctrine.

Theory and empiricism inform these inquiries. Drawing from a sample of district court decisions on Grable federal question jurisdiction, the study presented here identifies and explores the phenomenon of “submerged precedent” – reasoned opinions hidden on court dockets, and not included in the Westlaw or Lexis databases. The study detailed here found that submergence may obscure as much as 30% of reasoned law on Grable federal questions from the view of conventional research.

This article investigates the structural and institutional forces behind submergence, as well as its doctrinal implications. By effectively insulating some reasoned opinions from future use by judges and practitioners, the phenomenon of submerged precedent threatens to skew substantive law and erode the precedential system’s animating principles of fairness, efficiency, and legitimacy. Most urgently, the existence of submerged precedent suggests that Congress’s mandate for public access to federal precedents in the E-Government Act of 2002 lies unfulfilled in important respects. The application of precedent theory informed by empirical observation suggests a more thoughtful approach to technology and public access to precedent.
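As a side note on the arithmetic, here is a tiny, hypothetical sketch of how a “submergence rate” like the 30% figure can be computed from a docket sample; the sample below is invented, not McCuskey’s Grable data.

```python
# Toy docket sample: (opinion_id, found_in_westlaw, found_in_lexis) -- invented values.
sample = [
    ("op-01", True,  True),  ("op-02", True,  False), ("op-03", False, False),
    ("op-04", True,  True),  ("op-05", False, False), ("op-06", True,  True),
    ("op-07", False, False), ("op-08", True,  True),  ("op-09", True,  True),
    ("op-10", True,  False),
]

# A reasoned opinion is "submerged" if it appears in neither commercial database.
submerged = [oid for oid, in_wl, in_lx in sample if not (in_wl or in_lx)]
rate = len(submerged) / len(sample)
print(f"{len(submerged)} of {len(sample)} reasoned opinions ({rate:.0%}) exist only on the docket")
```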

— Joe

Is a uniform system of citation an open-source feature of our legal system’s infrastructure?

In The new (and much improved) ‘Bluebook’ caught in the copyright cross-hairs (The Volokh Conspiracy), David Post writes that “[w]ar is brewing over the most boring piece of intellectual property imaginable: the ‘Bluebook… .’” At issue is the alpha release of NYU Law professor Christopher Sprigman and Carl Malamud’s open-source Baby Blue’s Manual of Legal Citation (Public.Resource.Org, January 1, 2016). From Baby Blue’s Preface:

It is important to understand, when we are talking about “The Bluebook, A Uniform System of Citation,” that we are talking about two different things. There is a product, a spiral-bound booklet that sells for $38.50, which is accompanied by a rudimentary web site available to purchasers of the product.

Underlying that product, however, is something much more basic and fundamental, a uniform system of citation. Unpaid volunteers from a dozen law schools, under the stewardship of four nonprofit student-run law reviews, have labored mightily to reach a consensus standard for the citation of legal materials. This open consensus standard was developed, with no compensation to the authors, for the greater benefit of the legal system of the United States. By clearly and precisely referring to primary legal materials, we are able to communicate our legal reasoning to others, including pleading a case in the courts, advocating changes in legal policy in our legislatures or law reviews, or simply communicating the law to our fellow citizens so that we may be better informed.

We do not begrudge the Harvard Law Review Association one penny of the revenue from the sale of their spiral-bound book dressed in blue. However, we must not confuse the book with the system. There can be no proprietary claim over knowledge and facts, and there is no intellectual property right in the system and method of our legal machinery. The infrastructure of our legal system is a public utility, and belongs to all of us.

Kathryn Rubino’s Controversy At Harvard Law Over The Bluebook? (ATL) summarizes recent developments. — Joe

End Note: Download Sprigman et anon. al., Baby Blue’s Manual of Legal Citation (Public.Resource.Org, 2016).

RIP Westlaw Classic and WestlawNext

Earlier this month Thomson Reuters announced that WestlawNext has been renamed Thomson Reuters Westlaw “in light of Westlaw Classic’s retirement.” Hat tip to LLB readers who called the name change announcement to my attention after reading this post. — Joe

Good old fashioned WEXIS competition

Just in case you haven’t noticed, the tab and sidebar war has returned for our very expensive legal search vendors. Without the branding could you tell which is WestlawNext and which is Lexis Advance? Ok, the key number symbol gives it away.

BTW, when did Thomson Reuters drop “Next”? — Joe

[Screenshot: WestlawNext search interface]

[Screenshot: Lexis Advance search interface]

What The [Bad Word] Is This Supposed To Be?

I was helping a cite checker with the Pennsylvania Consolidated Statutes when I came across a free site called WestlawNext State Government Sites. I’m not sure what TR is trying to do with this, as FindLaw still exists. The site offers a small but haphazard collection of primary and quasi-primary materials from what I would describe as all over the place. The limited number of states represented is far from comprehensive, and the materials presented are just as puzzling. Each collection has different ways to conduct searches. Take a look. Anybody with reactions, please let me know.

Mark

UELMA adoption does not correlate with barrier-free access, says Glassmeyer report

Sarah Glassmeyer has released the results of her survey of state primary law, the State Legal Information Census (PDF). Here’s the abstract:

This report presents findings from a survey of state level primary legal information. Primary legal information includes code (codified statutes passed by state legislatures), regulations (codified collections of rules passed by administrative agencies) and case law (appellate court decisions). This survey was done with the goal of reviewing the free and open status of this legal information.

Findings indicate that there exist at least 14 barriers to accessing legal information. These barriers exist both for the individual user of a resource for personal research and for an institutional user that would seek to republish or transform the information. At the time of the census, no state provided barrier-free access to its legal information.

Furthermore, analysis of the legal information provided by states shows that it is impossible to do any but the most basic legal research for free using state provided legal information sources. Current collections allow for citation retrieval and some basic keyword searching. No state allows for federated searching of legal information collections. The universal lack of a citator for case law renders these collections, as a practical matter, useless, and it would be considered malpractice for a legal practitioner to rely upon them. There is also a worrisome lack of archival material maintained by states. Not only does this affect one’s ability to do comprehensive research, but it also could be indicative of a lack of adequate preservation.

States were scored and ranked based on the openness of their legal publication practices. On a scale of 0 to 24, the highest score achieved was 18. The lowest was 8 and the median was 14. These results were compared against the adoption of the Uniform Electronic Legal Material Act (UELMA) and it was found that adoption of UELMA did not correlate with barrier-free publication practices.
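For readers curious how one would test the UELMA claim, here is a rough sketch of the kind of comparison involved, using invented adoption flags and openness scores on the report’s 0-24 scale; the real numbers are in the census itself.

```python
# Hypothetical (state UELMA adoption, openness score) pairs -- not Glassmeyer's data.
from statistics import mean, pstdev

states = [(1, 14), (1, 12), (1, 16), (1, 10), (1, 13),
          (0, 15), (0, 13), (0, 18), (0, 11), (0, 14)]

adopted = [score for flag, score in states if flag == 1]
not_adopted = [score for flag, score in states if flag == 0]
all_scores = [score for _, score in states]

# Point-biserial correlation between a binary variable (UELMA adoption) and a score.
p = len(adopted) / len(states)
q = 1 - p
r_pb = (mean(adopted) - mean(not_adopted)) / pstdev(all_scores) * (p * q) ** 0.5

print(f"Mean openness score, UELMA states: {mean(adopted):.1f}; non-UELMA states: {mean(not_adopted):.1f}")
print(f"Point-biserial correlation: {r_pb:.2f} (values near zero suggest no relationship)")
```

A correlation near zero, as in Glassmeyer’s finding, means knowing whether a state adopted UELMA tells you little about how openly it publishes its law.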

— Joe