The pitfalls of AI that claims to predict the outcome of court cases



Companies have long sought out technologies that promise an advantage in fighting litigation. For most businesses, litigation is a major drain on resources. In 2020, American companies spent a total of $22.8 billion on litigation; law firm Fulbright & Jaworski estimated in 2005 that nearly 90% of businesses are engaged in some type of litigation, and that the average business juggles a docket of 37 lawsuits.

With the democratization of AI and analytics tools, it was perhaps inevitable that startups would start applying predictive techniques to the legal realm, especially given the huge market opportunity. (According to Statista, legal technology segment revenue could reach $25.17 billion in 2025.) For example, Ex Parte, a predictive analytics company founded by former attorney Jonathan Klein, claims to use AI and machine learning to predict the outcome of litigation and recommend actions companies can take to “maximize their chances of winning.”

But experts doubt that AI can predict events as complex as the course of court cases. As Mike Cook, a member of the Knives & Paintbrushes research collective, told VentureBeat, litigation involves a number of factors that the data used to “teach” an AI system might not capture, which could result in erroneous or potentially biased predictions. “You can certainly train an AI to predict things – you can train an AI to predict anything you want – but it doesn’t always learn what you think it learns,” he said in an interview. “When it comes to legal cases, there’s a lot we don’t see, and a lot we can’t give an AI to train on, which means it could end up learning only part of the picture.”

A growing market

Ex Parte isn’t the only provider that claims to be able to predict the outcome of court cases. Blue J Legal, based in Toronto, Canada, claims it can estimate litigation outcomes with 90% accuracy, relying on models trained on corpora of relevant precedents and the fact pattern of a case. ArbiLex, a competitor, focuses on arbitration, presenting companies with metrics such as the likely cost of a case and which side is likely to win.
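None of these vendors disclose their model architectures, but the basic idea they describe – scoring a case from hand-coded features of its facts and precedents – can be sketched with a toy logistic model. Every feature name and weight below is invented for illustration; this is not Blue J’s or Ex Parte’s method.

```python
import math

def predict_win_probability(features, weights, bias):
    """Toy logistic model: probability = sigmoid(w . x + b)."""
    score = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-score))

# Invented weights standing in for what a model might learn
# from a corpus of past decisions.
weights = {
    "favorable_precedents": 0.9,   # count of on-point wins in the corpus
    "adverse_precedents": -1.1,    # count of on-point losses
    "documentary_evidence": 0.6,   # 1 if the key facts are documented
}

# A hypothetical case, encoded as feature values.
case = {"favorable_precedents": 3, "adverse_precedents": 1, "documentary_evidence": 1}

p = predict_win_probability(case, weights, bias=-0.5)
print(f"Estimated chance of winning: {p:.0%}")
```

The sketch makes Cook’s objection concrete: the model can only weigh whatever features someone chose to encode, and anything left out of that feature set is invisible to it.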

“The legal market has seen an explosion of AI products, particularly in machine learning and natural language processing. Uses include improving contract management, gaining insight into business data, legal department and law firm operations, and the analysis of U.S. public law,” Ron Friedmann, senior research director at Gartner, told VentureBeat via email. “The American legal system has a vast volume of publicly available law, including court decisions, agency decisions, petitions and briefs. Beginning in the 1980s, portions of US law became available online, primarily for document retrieval. Around 2010, startups emerged that offered deeper insight into publicly available law.”

Ex Parte — which recently raised $7.5 million in Series A funding, including from Ironbound Partners and R8 Capital, the firm of former Illinois Governor Bruce Rauner — generates recommendations for corporate litigation, such as whether to settle and which claims to argue, where to file or defend a lawsuit, and which attorneys and law firms might offer the best chance of success. Klein says the platform can correctly predict the outcome of cases about 85% of the time.

“There are many technical and conceptual challenges associated with building a robust predictive model. Legal data is inherently disparate, unstructured and semantic,” Klein said in a statement. “We solved these problems by combining a highly specialized understanding of the legal field with advanced expertise in artificial intelligence, machine learning and natural language processing. Our mission is to be the world’s leading provider of data-driven decision-making solutions in the legal field and to provide our clients with a winning advantage.”

Recent history, however, is replete with examples of how problematic data can adversely influence AI legal tech predictions.

The US justice system has embraced AI tools that were later found to have biases against defendants from certain demographic groups. Perhaps the most infamous of these is Northpointe’s Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), which is designed to predict a person’s likelihood of becoming a repeat offender. A ProPublica report found that COMPAS was much more likely to incorrectly judge Black defendants to be at higher risk of recidivism than white defendants, while flagging white defendants as low risk more often than Black defendants.

Published last December, a separate study by researchers from Harvard and the University of Massachusetts found that the Public Safety Assessment (PSA), a risk assessment tool that judges can opt to use to decide whether a defendant should be released before trial, tends to recommend sentences that are too harsh. According to the researchers, the PSA is also more likely to impose cash bail on arrested men than on arrested women – a potential sign of gender bias.

In July 2018, more than 100 community and civil rights organizations, including the ACLU and NAACP, signed a statement urging against the use of algorithmic risk assessment tools such as COMPAS and PSA.

Potential risks

Of course, the stakes are lower in the cases that Ex Parte and other enterprise-focused predictive analytics startups are currently handling. (Klein says Ex Parte’s clients include hedge funds, law firms, insurance companies, litigation funding firms, and universities.) And researchers like Cook acknowledge that some litigation, like routine real estate disputes and other “generic” legal work, could be within the realm of what is possible to predict. But Cook warns that if the technology were to be widely adopted, it could create a “weird” and unpredictable feedback loop as future AI learns from the outcomes of cases predicted by today’s AI.

“One thing we haven’t explored properly yet with AI systems like this is the impact they have on their own ecosystem just by existing. If this system were to become the norm, 20 years from now we would be training these systems on the outcomes of cases decided by the AI… So even if it works at the moment, it will potentially create some weird situations in the future,” Cook said. “[I]f you imagine applying that to boring, low-stakes, innocuous legal stuff, maybe that’s not so bad. But if we imagine it expanding to huge lawsuits, cases involving the lives of real people, and big issues, I think it becomes a much sadder and more unpredictable idea.”
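The feedback loop Cook describes can be made concrete with a toy simulation. Suppose a model’s prediction of a loss causes a party to settle, so that case never produces a verdict and never enters the next generation’s training data. The dynamics below are entirely invented – a sketch of the mechanism, not a model of any real court system:

```python
import random

random.seed(0)

def simulate_generation(model_threshold, n=10_000):
    """Return the recorded verdicts (True = win) for one generation of cases.

    Each case has a latent 'strength' in [0, 1], which is also its true
    chance of winning. If the model scores the case below its threshold,
    the party settles and no verdict is recorded.
    """
    verdicts = []
    for _ in range(n):
        strength = random.random()
        if strength < model_threshold:
            continue  # predicted loss -> settlement, case leaves the record
        verdicts.append(random.random() < strength)
    return verdicts

# Generation 1: no model in use, every case goes to verdict.
baseline = simulate_generation(model_threshold=0.0)
# Generation 2: a model screens out every case it scores below 0.5.
filtered = simulate_generation(model_threshold=0.5)

print(f"win rate, all cases tried:      {sum(baseline) / len(baseline):.2f}")
print(f"win rate, model-screened cases: {sum(filtered) / len(filtered):.2f}")
```

Once the model is in the loop, the recorded win rate drifts well above the true baseline, so a successor trained on those records would inherit a skewed picture of how cases actually resolve – the “weird situations” Cook anticipates.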

Os Keyes, an AI ethicist at the University of Washington, also expressed concern about how case-predictive AI could perpetuate inequities in the justice system. In a future where most of the wealthiest clients – whether businesses or individuals – are tapping into AI-powered case recommendations, assuming the technology works, those who can’t avail themselves of the same tools could be disadvantaged.

Well-funded defendants already have advantages in the justice system, as trial attorney Kiernan McAlpine pointed out in a recent Quora post. Affluent clients can afford to settle cases they are likely to lose and pay expensive discovery and pre-trial attorneys. They also have the budgets to pay experts with high-level knowledge and testimony skills, as well as firms with arcane legal knowledge.

Some experts say these advantages hold true even at the highest levels of the justice system. Adam Cohen, author of Supreme Inequality: The Supreme Court‘s 50-Year Battle for a More Unjust America, argues that conservative Supreme Court justices — who rarely vote to overturn the convictions of poor defendants — have shown clear sympathy for the wealthy. One study found that conservative Justice Antonin Scalia voted for defendants in about 7% of non-white-collar criminal cases but 82% of white-collar cases.

“The result of [this AI case-predicting technology] working will be – and I don’t say this lightly – socially appalling,” Keyes told VentureBeat via email. “[W]e already know that there is a big difference in outcomes between poor and indigent clients and wealthy clients, and a big part of that is resourcing. The promise of this [technology]… is to make matters worse – to pile even more resources into the corner of those who already have them.”

