Media-Rich Input Application Liability
41 Pages · Posted: 13 May 2010 · Last revised: 27 Oct 2012
Date Written: May 12, 2010
Until recently, the media-richness of online interactions was mostly unidirectional: multimedia content was delivered by the service provider to the user, while input from the user came almost exclusively in the form of text. Even when searching the internet for images or audio, a user typically entered text into a search engine. Moreover, search engines indexed multimedia content not by analyzing the content itself, but by the text surrounding it. This is rapidly changing. With the rise of extensible, multimedia-capable smartphones and wireless broadband, applications that accept non-textual search inputs are quickly becoming popular. Google Goggles, for example, takes as its search input an image captured with the camera built into the user's phone; Shazam and Midomi allow users to search for music by transmitting a short clip recorded with their phones. Some of these applications even enlist users in indexing content by having them sing specific songs into the service. Finally, video games such as The Beatles: Rock Band take actual voices and instrument-like controllers as input and match them to the music of popular songs.

This article examines the copyright implications of these applications, which I term Media-Rich Input Applications (MRIAs). It considers how the unique features of MRIAs interact with current copyright doctrine, and how the lack of protection for users may discourage innovation in this new and exciting area of technology. It then proposes a new user safe harbor that balances users' interest in using MRIAs against copyright owners' interest in protecting their exclusive rights.