'The Chatbot Will See You Now': Mental Health Confidentiality Concerns in Software Therapy

58 Pages Posted: 19 Jul 2018


Scott Stiefel

Loyola University Chicago, Corboy School of Law, Students

Date Written: May 1, 2018

Abstract

Woebot, a chatbot leveraging cognitive behavioral therapy techniques, promises a new domain of software therapy, in which users turn to software to address feelings of anxiety or depression. Similar technologies claim to provide "chat" or "support," but not treatment, so as to avoid FDA oversight. Regardless, the current U.S. privacy framework imposes no confidentiality obligations on these apps, even though states impose comparable restrictions on the use and disclosure of such information by licensed mental health professionals.

This Article argues that new legislation is necessary to protect the use and disclosure of information submitted by consumers to chatbots intended for use as electronic therapists. It examines existing state laws, the sensitivity of mental health information and the need for greater confidentiality protections, as well as the Supreme Court's opinion in Jaffee v. Redmond as a guide. Ultimately, it concludes that cognitive behaviorally trained therapy A.I. intended for use by consumers as such should be subject to heightened confidentiality requirements.

Keywords: privacy, mental health, medical device, device, mHealth, digital health, confidentiality

Suggested Citation

Stiefel, Scott, 'The Chatbot Will See You Now': Mental Health Confidentiality Concerns in Software Therapy (May 1, 2018). Available at SSRN: https://ssrn.com/abstract=3166640 or http://dx.doi.org/10.2139/ssrn.3166640

Scott Stiefel (Contact Author)

Loyola University Chicago, Corboy School of Law, Students ( email )

25 E. Pearson
Room 1041
Chicago, IL 60611
United States

