Pay-to-play: Access to Justice in the Era of AI and Deepfakes
58 Pages · Posted: 1 Mar 2024 · Last revised: 27 Jan 2025
Date Written: February 10, 2024
Abstract
The American adversarial legal system's effective functioning assumes that parties will have an equal opportunity to present competing claims and that the truth will emerge from the evidence. The system fails, however, when a party cannot afford to develop and offer its evidence. As Artificial Intelligence (AI) and deepfakes continue to infiltrate society, they inevitably invade legal proceedings. Cases involving AI and deepfake evidence will require courts and parties to employ expensive expert analysis to determine the authenticity of the technological evidence. At the same time, problems with access to justice persist because many Americans with legal problems cannot afford to pay litigation expenses. AI and deepfake evidence in legal proceedings exacerbates the ongoing unmet need for legal services, making a bad situation with access to justice worse.
The lack of resources to pay for expert witness analysis may ultimately mean that some litigants in deepfake and AI cases will be denied access to the legal system. For example, imagine a scenario in which proving that a digital image is a deepfake is dispositive to a plaintiff's case, but the plaintiff cannot afford to pay for digital forensic expert witness analysis to prove it. As a result, the plaintiff's attorney may have no option but to withdraw the evidence. In that instance, the litigation cost of proving the audio-visual image is fake denies the plaintiff the opportunity to obtain justice. The proliferation of AI and deepfakes in litigation, coupled with the cost-prohibitive expense of litigating them, provides the necessary context for the central question in this Article: how to maintain access to justice in cases involving AI and deepfake evidence. The introduction of deepfake and AI evidence in legal proceedings will trigger a failure of the adversarial system because the law currently offers no effective way for those who lack resources to pay for litigating this evidence.
This Article addresses matters not previously considered in the prior scholarship on AI, deepfakes, and access to justice. It is the first to explore access to justice and AI evidence together, locating them within the historical and current framework of laws and practice norms. The Article proposes a novel solution: the proponent of the deepfake allegation should pay the expert witness costs of litigating the claim. It further proposes that if the court determines that the deepfake claim is non-frivolous and not asserted for an improper purpose, and if financial need is demonstrated, the deepfake litigation costs should be allocated to the other party. It also urges overruling outdated Supreme Court case law that restricts recovery of expert witness costs. The Article further argues that the definition of recoverable litigation costs should be expanded to include expert evidence associated with litigating AI evidence. It offers a unique approach to securing access to justice, leveling the playing field so everyone can have a fair opportunity to litigate deepfake and AI cases. Deploying these new tools and mechanisms will safeguard the justice system's truth-seeking function.
Keywords: Access to justice, Artificial Intelligence, AI, Deepfakes, Expert witness, digital forensic expert, technological evidence, evidence, litigation expenses, cost shifting, prevailing parties, digital evidence, access to legal services, cost allocation, costs, experts
JEL Classification: K00, K1, K2, K3, K4