Legal Analytics Products Deliver Widely Divergent Results, Study Shows

LawSites

I have made no secret of the fact that I consider litigation analytics to be one of the most important technologies to have gained traction in recent years. Writing about analytics a year ago on Above the Law, I titled the post, This Tech Can Turn the Tables in Litigation. In my year-end summary of the most important legal tech developments of 2018, my top item was “analytics become essential,” noting, “We could be nearing a point where it would be malpractice for a lawyer not to use analytics.”

While all of that still holds true, and while the technology has continued to evolve and the array of products to expand in the year since then, it is important for legal professionals to understand that litigation analytics is still a developing technology, with weaknesses in both the technology itself and the data on which these products rely.

That means that these products differ not only in the kinds of analytics they offer, but also in the results they deliver.

The differences among these products were dramatically highlighted by a study conducted earlier this year by a group of law librarians who compared federal court results across seven legal analytics products. The products they compared were: Bloomberg Law, Docket Alarm Analytics Workbench (from Fastcase), Docket Navigator, Lex Machina, Lexis Context, Thomson Reuters Monitor Suite and Westlaw Edge.

Consider this seemingly simple research query: “In how many cases has Irell &amp; Manella appeared in front of Judge Richard Andrews in the District of Delaware?” When the librarians ran that query across the seven products, they got widely different answers.

The librarians tested 16 “real world” research questions across the seven products, with similarly varying results. Although they have not yet published their findings, four of the participants — Diana Koppang, director of