Time and the Value of Data

43 Pages. Posted: 27 Aug 2020. Last revised: 11 Nov 2021.


Ehsan Valavi

Massachusetts Institute of Technology ; Massachusetts Institute of Technology (MIT) - Sloan School of Management

Joel Hestness

Cerebras Systems

Newsha Ardalani

Baidu - Baidu Research

Marco Iansiti

Harvard University - Business School (HBS)

Date Written: November 1, 2021


Managers often believe that collecting more data will continually improve the accuracy of their machine learning models. However, we argue in this paper that when data lose relevance over time, it may be optimal to collect a limited amount of recent data rather than maintain an ever-growing archive of older (less relevant) data. Moreover, increasing the stock of data by including older datasets may in fact damage the model's accuracy. As expected, the model's accuracy improves with an increasing flow of data (defined as the data collection rate); however, a higher flow requires other tradeoffs, such as refreshing or retraining machine learning models more frequently.
Using these results, we investigate how the business value created by machine learning models scales with data and when the stock of data establishes a sustainable competitive advantage. We argue that data's time-dependency weakens the barrier to entry that the stock of data creates. As a result, a competing firm equipped with a limited (yet sufficient) amount of recent data can develop more accurate models. This result, coupled with the fact that older datasets may deteriorate a model's accuracy, suggests that the business value created does not scale with the stock of available data unless the firm offloads less relevant data from its data repository. Consequently, a firm's growth policy should balance the stock of historical data against the flow of new data.
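The intuition behind the weakened barrier to entry can be sketched numerically. The snippet below is a minimal illustration, not a formula from the paper: it assumes each period's batch of data decays exponentially in relevance (retention factor `decay` per period, an assumed parameterization) and sums the discounted contributions of all retained batches.

```python
def effective_stock(flow, decay, periods):
    """Effective data stock: each past batch's contribution,
    discounted by its age.

    flow    : data collected per period (e.g., MB per year)
    decay   : per-period retention factor in (0, 1); a batch that is
              `age` periods old contributes decay**age of its value
    periods : how many past periods of data are kept
    """
    return sum(flow * decay**age for age in range(periods))

# Under decay, the effective stock saturates near the geometric-series
# ceiling flow / (1 - decay) instead of growing without bound, so an
# entrant collecting at the same flow quickly catches up.
ten_years = effective_stock(1.0, 0.9, 10)      # about 6.51x annual flow
hundred_years = effective_stock(1.0, 0.9, 100) # about 10x, the ceiling
```

With a retention factor of 0.9, keeping ten years of data yields an effective stock of roughly 6.5 times the annual flow, while keeping a century of data yields barely 10 times: almost all of the incumbent's extra history contributes nothing, which is the sense in which the stock of data stops being a durable advantage.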
We complement our theoretical results with an experiment. In the experiment, we use the simple yet widely used machine learning task of next-word prediction. We empirically measure the loss in accuracy of a next-word prediction model trained on datasets from various time periods. Our empirical measurements confirm the economic significance of the decline in value over time. For example, after seven years, 100MB of text data becomes as valuable as 50MB of current data for the next-word prediction task.
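Taken at face value, the 100MB-to-50MB figure is consistent with a half-life of roughly seven years for text data in this task. The sketch below extrapolates from that single data point under an exponential-depreciation assumption; the functional form and the half-life parameter are our illustration, not the paper's fitted model.

```python
def equivalent_current_data(size_mb, age_years, half_life_years=7.0):
    """Express aged data as an equivalent amount of current data,
    assuming exponential depreciation. The 7-year default half-life
    is inferred from the 100MB -> 50MB example (an assumption)."""
    return size_mb * 0.5 ** (age_years / half_life_years)

print(equivalent_current_data(100, 7))    # 50.0, matching the example
print(equivalent_current_data(100, 14))   # 25.0 after two half-lives
```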

Keywords: economics of AI, machine learning, non-stationarity, perishability, value depreciation

Suggested Citation

Valavi, Ehsan and Hestness, Joel and Ardalani, Newsha and Iansiti, Marco, Time and the Value of Data (November 1, 2021). Harvard Business School Strategy Unit Working Paper No. 21-016, Available at SSRN: https://ssrn.com/abstract=3680910 or http://dx.doi.org/10.2139/ssrn.3680910

Ehsan Valavi (Contact Author)

Massachusetts Institute of Technology

77 Massachusetts Avenue
Cambridge, MA 02139
United States

Massachusetts Institute of Technology (MIT) - Sloan School of Management

Joel Hestness

Cerebras Systems

Newsha Ardalani

Baidu - Baidu Research


Marco Iansiti

Harvard University - Business School (HBS)

Soldiers Field Road
Morgan 270C
Boston, MA 02163
United States

