Automated Legal Guidance

67 Pages · Posted: 10 Mar 2020 · Last revised: 25 Mar 2021

Joshua D. Blank

University of California, Irvine School of Law

Leigh Osofsky

University of North Carolina (UNC) at Chapel Hill

Date Written: March 1, 2020


Through online tools, virtual assistants and other technology, governments increasingly rely on artificial intelligence to help the public understand and apply the law. The Internal Revenue Service, for example, encourages taxpayers to seek answers regarding various tax credits and deductions through its online “Interactive Tax Assistant.” The U.S. Army directs individuals with questions about enlistment to its virtual guide, “Sgt. Star.” And the U.S. Citizenship and Immigration Services suggests that potential green card holders and citizens speak with its interactive chatbot, “Emma.” Through such automated legal guidance, the government seeks to provide advice to the public at a fraction of the cost of employing human beings to perform these same tasks.

This Article offers one of the first critiques of these new systems of artificial intelligence. It shows that automated legal guidance currently relies upon the concept of “simplexity,” whereby complex law is presented as though it is simple, without actually simplifying the underlying law. While this approach offers potential gains in efficiency and ease of use, it also causes the government to present the law as simpler than it is, leading to less precise advice and potentially inaccurate legal positions. Using the Interactive Tax Assistant as a case study, the Article shows that simplexity in automated legal guidance is more powerful and pervasive than in static publications because it is personalized, unqualified, and instantaneous. Further, it argues that understanding the costs as well as the benefits of current forms of automated legal guidance is essential to evaluating the even more sophisticated, but also more opaque, automated systems that governments are likely to adopt in the future.

With these considerations in mind, the Article offers three recommendations to policymakers. First, it argues that governments should prevent automated legal guidance from widening the gap in access to legal advice between high-income and low-income individuals. Second, it argues that governments should introduce more robust oversight and review processes for automated legal guidance. Finally, it argues that the government should allow individuals to avoid certain penalties and sanctions when they have taken actions or claimed legal positions in reliance upon automated legal guidance. Unless these steps are taken, we believe that the costs of these automated legal guidance systems may soon come to outweigh their benefits.

Keywords: Artificial Intelligence, Automation, Tax Simplification, Taxpayer Communication, Transparency, IRS, Tax Compliance, Plain Language, Simplexity

JEL Classification: H20, H23, H24, H25, H26, H29, K34

Suggested Citation

Blank, Joshua D. and Osofsky, Leigh, Automated Legal Guidance (March 1, 2020), 106 Cornell Law Review 179 (2020); UC Irvine School of Law Research Paper (forthcoming); UNC Legal Studies Research Paper. Available at SSRN.

Joshua D. Blank (Contact Author)

University of California, Irvine School of Law

401 E. Peltason Dr.
Ste. 1000
Irvine, CA 92697-1000
United States


Leigh Osofsky

University of North Carolina (UNC) at Chapel Hill

102 Ridge Road
Chapel Hill, NC 27514
United States
