Culture & Society

Without reforms, AI could deepen America’s justice gap, expert says

Texas A&M law professor Milan Markovic warns that while generative artificial intelligence promises legal help at scale, it could also amplify inequalities in the justice system.

In the United States, civil justice is often a solo endeavor for low- and middle-income Americans, with an estimated 92 percent of people with civil legal problems receiving no formal legal help.

Some prominent lawyers, judges and scholars have pointed to artificial intelligence as a long-awaited solution to America’s “access to justice” crisis by increasing efficiencies and demystifying the legal process. But Texas A&M University law professor Milan Markovic urges caution.

Markovic, a professor of law and presidential impact fellow at Texas A&M’s School of Law, says that while generative AI tools like ChatGPT and other large language models will be increasingly important sources of legal assistance for underserved populations, “techno-optimists” are too bullish on AI’s potential to help. While AI may democratize access to legal information, it also risks reinforcing — or even exacerbating — existing inequalities, he argues in a forthcoming article in the Ohio State Law Journal.

AI’s promise — and potential pitfalls

Proponents contend that AI will expand access to law, making it far easier for people to navigate the legal system and vindicate their rights. Even Chief Justice John Roberts has hailed AI’s “welcome potential to smooth out any mismatch between available resources and urgent needs in our court system.”

This optimism, Markovic said, risks obscuring the deeper realities of America’s adversarial justice system.

He acknowledges generative AI's transformative potential for some tasks, especially everyday legal problems like responding to legal communications or understanding basic rights. But without reforms, he says, AI could entrench inequality rather than dismantle it.

AI may slash transaction costs by automating tasks traditionally performed by lawyers, like drafting contracts and other legal documents, but Markovic said this will make it easier for “sophisticated legal actors” like landlords, debt collectors and corporations to initiate legal action. Repeat litigants will be able to pursue claims more aggressively and at higher volumes, increasing the legal burdens on those least equipped to respond.

“This is not just a problem for people who are going to be hurt by AI — particularly underrepresented communities — it’s a problem for the legal system itself,” Markovic said.

Asymmetric information and AI’s limits

Markovic said one of the most common misconceptions about the access-to-justice crisis is that it is primarily driven by the cost of legal services.

“There are many barriers that prevent people from exercising their rights beyond the cost of lawyers, and probably the most fundamental is that people don’t even know they have legal problems,” he said. “This is the kind of thing you only really know by working with these populations.”

People often think of workplace, housing or family disputes in everyday terms — not as potential violations of their legal rights. This gap in legal awareness, Markovic said, is a significant barrier that AI alone is unlikely to overcome.

“It’s extremely difficult to make people cognizant of the fact that they have legal problems and then get them to actually act on their rights,” he said. As a result, AI tools are more likely to benefit people who already have the time, resources and legal awareness to investigate potential claims.

Another issue is asymmetric information, a condition in which service providers have far more knowledge than consumers. In legal markets, this imbalance makes it difficult for people to know what help they need or whether the advice they receive is reliable. This dynamic becomes especially problematic when individuals turn to generative AI for legal guidance. Even when AI produces confident-sounding answers, users may lack the expertise to evaluate the accuracy of outputs or their limitations.

“AI systems tell you that they’re not an attorney and you should consult a lawyer, so if you rely on their advice, there’s not much recourse if you’re harmed,” Markovic said. “When you have a market that is rife with asymmetric information, it really puts consumers at a disadvantage and advantages unscrupulous providers.”

Generative AI’s tendency to produce false or entirely fabricated information in its outputs — known as hallucinations — also poses serious risks to courts and litigants. Markovic said that because legal claims depend on accurate citations to statutes and case law, hallucinated sources can undermine judicial decision-making. Lawyers have already been sanctioned for citing AI-generated cases that don’t exist, and courts are grappling with how to address the issue.

“It’s a huge problem, and the only way for it to stop is for lawyers and others to vet every single citation in a legal filing,” Markovic said. “But that’s time-consuming, and now we’re talking about undercutting the efficiencies that AI is supposed to provide.”

Proposed reforms

To prevent AI from amplifying existing inequalities, Markovic proposes two reforms.

First, he calls for training publicly funded “justice tech workers” to help underrepresented individuals use AI responsibly. These workers could steer people toward vetted, nonprofit legal tools and help review filings to prevent factual or legal errors from entering the court system.

Second, Markovic urges courts to strengthen requirements for verifying factual claims and legal authorities in cases that commonly involve unrepresented parties. Raising standards in high-volume litigation, he argues, would reduce abuse and protect the integrity of the justice system.

As AI adoption accelerates, Markovic predicts there will be “a lot of hardship and chaos along the way.” He hopes the legal system will take a more deliberate approach than other sectors to how it integrates these tools.

“We have an oath not only to our clients but to the legal system,” he said. “There’s a lot of pressure on lawyers and courts and law schools to deploy AI immediately.”