Network-Enabled Sequential Data Acquisition
43 Pages · Posted: 27 Apr 2022
Date Written: April 18, 2022
Abstract
Consumer data are strategic assets for digital platforms. A significant challenge on these platforms is accurately inferring users' preferences from noisy and sparse data under a limited acquisition budget, especially when future reward functions can take various forms and may be unknown during the data acquisition process. Our work addresses this challenge by introducing a new sequential data acquisition problem aimed at improving the long-term, flexible rewards of digital platforms within these budget constraints. We propose an algorithm, named RI-Net (i.e., Reward-Information NETwork-amplified), that evaluates both the expected reward and the value of information of acquiring data points in the low-rank rating matrix, while simultaneously exploiting homophily and structural cues from consumer networks and their information diffusion dynamics. We leverage consumer networks to mitigate the challenges of limited data samples and constrained acquisition budgets by encoding consumer preferences and modeling their implications for network effects. To learn user representations, we introduce a locally smooth mechanism that constructs dynamic neighborhoods around individual users; this mechanism allows the predictive power of neighboring users with respect to the focal user to vary over time. Extensive evaluations on three canonical real-world recommendation datasets demonstrate that RI-Net outperforms existing methods. Adopting our data acquisition strategy, which balances reward with information value while leveraging the consumer network, improves cumulative rewards by 7.1% to 14.3%.
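The abstract describes the acquisition loop only at a high level. The following minimal Python sketch illustrates the general idea it outlines: scoring candidate entries of a partially observed low-rank rating matrix by a combination of predicted reward and an information-value term, with network-based smoothing of user preferences. Everything here (the synthetic data, the names estimate_preferences and network_smooth, the beta trade-off weight, and the uncertainty proxy) is an illustrative assumption, not the actual RI-Net procedure or its hyperparameters.

import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all sizes and priors are illustrative, not from the paper).
n_users, n_items, rank = 30, 20, 3
U_true = rng.normal(size=(n_users, rank))
V_true = rng.normal(size=(n_items, rank))
ratings = U_true @ V_true.T + 0.1 * rng.normal(size=(n_users, n_items))

observed = rng.random((n_users, n_items)) < 0.05        # sparse initial observations
adjacency = (rng.random((n_users, n_users)) < 0.1).astype(float)
adjacency = np.maximum(adjacency, adjacency.T)           # undirected consumer network
np.fill_diagonal(adjacency, 0.0)

def estimate_preferences(obs_mask, lam=1.0):
    # Row-wise ridge estimate of user factors against fixed item factors.
    # Item factors are held at V_true only to keep the sketch short; a real
    # system would alternate user and item updates (ALS-style).
    U_hat = np.zeros((n_users, rank))
    for u in range(n_users):
        idx = np.where(obs_mask[u])[0]
        if idx.size == 0:
            continue
        Vi = V_true[idx]
        U_hat[u] = np.linalg.solve(Vi.T @ Vi + lam * np.eye(rank),
                                   Vi.T @ ratings[u, idx])
    return U_hat

def network_smooth(U_hat, alpha=0.5):
    # Blend each user's estimate with the average of network neighbors (homophily).
    deg = adjacency.sum(axis=1, keepdims=True)
    neighbor_avg = (adjacency @ U_hat) / np.maximum(deg, 1.0)
    return (1 - alpha) * U_hat + alpha * neighbor_avg

budget, beta = 40, 1.0   # beta trades off expected reward against information value
for step in range(budget):
    own = estimate_preferences(observed)
    U_hat = network_smooth(own)
    pred = U_hat @ V_true.T                               # expected reward proxy
    # Information-value proxy: disagreement between a user's own estimate and
    # the network-smoothed one, broadcast over that user's candidate items.
    info_value = np.linalg.norm(U_hat - own, axis=1, keepdims=True) * np.ones((1, n_items))
    score = pred + beta * info_value
    score[observed] = -np.inf                             # never re-acquire an entry
    u, i = np.unravel_index(np.argmax(score), score.shape)
    observed[u, i] = True                                 # "acquire" this rating

print(f"Acquired {observed.sum()} entries after spending the budget.")

In this sketch the greedy argmax simply picks the entry with the best combined score each round; the paper's algorithm is more elaborate, but the reward-plus-information structure and the network smoothing step mirror the components the abstract names.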
Keywords: Data acquisition, recommender system, design science, consumer network, machine learning
Suggested Citation:
Cao, Junyu and Leng, Yan, Network-Enabled Sequential Data Acquisition (April 18, 2022). Available at SSRN: https://ssrn.com/abstract=4086999 or http://dx.doi.org/10.2139/ssrn.4086999