Keeping ChatGPT a Trade Secret While Selling It Too
Berkeley Technology Law Journal (forthcoming 2025)
90 Pages
Posted: 2 Jul 2024
Date Written: February 01, 2024
Abstract
Generative artificial intelligence products such as ChatGPT raise novel issues for trade secret law. But one of the most important issues is an old one: how can a company sell an information good, like computer software, while maintaining trade secret protection for the underlying content? When a company wishes to sell a new technology to the public, the normal recourse is to obtain a patent. Patents require public disclosure and expire after a fixed term of years. Based on decades of precedents established for software, however, generative AI companies will be able to rely on trade secret law instead, maintaining indefinite protection for their technology even as they profit from making it widely available to the public, and even after reverse engineering becomes technically feasible.

This is what many companies did with closed-source software, and it is what the developers of some generative AI models, including ChatGPT, are doing today. They are releasing the models in a “closed-source” format that hides algorithms, code, training data, and underlying model architecture from users. And they are attaching contractual provisions, called “terms of use” or “end user license agreements” (EULAs), that limit users’ ability to reverse engineer information about how the models work or to share that information with others. Some of these agreements, including ChatGPT’s, even contain noncompete provisions.

If liability for breaching these provisions were limited to breach of contract, there would be less cause for alarm. However, some case law, and some state statutes, indicate that reverse engineering trade secrets in breach of an anti-reverse-engineering clause can give rise to trade secret liability as well, on the theory that breaching the contract transforms otherwise-lawful reverse engineering into an “improper means” of acquiring trade secrets. The prospect of trade secret liability for what should be, at worst, a breach of contract is alarming. It means that prevailing plaintiffs can obtain trade secret remedies, not just contract remedies, and it means that liability can extend to third parties who never signed the contract. For example, if someone reverse engineers information about ChatGPT in violation of boilerplate terms of use and then shares that information with someone else, who publishes it on the internet, both actors could be liable for trade secret misappropriation.

Fortunately, there is a solution. In the Defend Trade Secrets Act (DTSA) of 2016, Congress made clear that reverse engineering is lawful under federal trade secret law and cannot be considered an “improper means” of acquiring a trade secret. The mere presence of a contract purporting to prohibit reverse engineering cannot change this rule, and a state law that holds otherwise is preempted by federal trade secret law under the Supremacy Clause of the Constitution. The upshot is that, in many circumstances, reverse engineering a publicly distributed generative AI model, or a traditional software product, is not trade secret misappropriation, regardless of the presence of a boilerplate anti-reverse-engineering clause. This doctrinal approach ensures that, once a widely available product can easily and cheaply be reverse engineered by members of the general public, companies cannot use contract to maintain trade secret protection indefinitely.
Keywords: trade secrets, AI, ChatGPT, generative AI, intellectual property, patents, secrecy, contracts, noncompetes, non-compete agreements, confidentiality agreements, reverse engineering, anti-reverse engineering clauses, EULAs, Terms of Use