'The Chatbot Will See You Now': Mental Health Confidentiality Concerns in Software Therapy
58 Pages Posted: 19 Jul 2018
Date Written: May 1, 2018
Woebot, a chatbot leveraging cognitive behavioral therapy techniques, promises to open a new area of software therapy, in which users turn to software to address feelings of anxiety or depression. Similar technologies claim to provide "chat" or "support," but not treatment, so as to avoid FDA oversight. Regardless, the current U.S. privacy framework imposes no confidentiality obligations on these apps, even though states impose comparable restrictions on the use and disclosure of such information by licensed mental health professionals.
This Article argues that new legislation is necessary to protect the use and disclosure of information submitted by consumers to chatbots intended for use as electronic therapists. It examines existing state laws, the sensitivity of mental health information and the need for greater confidentiality protections, and the Supreme Court's opinion in Jaffee v. Redmond as a guide. Ultimately, it concludes that cognitive behaviorally trained therapy A.I. intended for use by consumers as such should be subject to heightened confidentiality requirements.
Keywords: privacy, mental health, medical device, device, mHealth, digital health, confidentiality