Solving High-Dimensional Dynamic Programming Using Set Transformer
39 Pages · Posted: 27 Jan 2025 · Last revised: 17 Apr 2025
Date Written: November 15, 2024
Abstract
This paper proposes a new approach to solving high-dimensional dynamic programming (HDDP) problems in economics by leveraging the Set Transformer architecture introduced by Lee et al. [2019]. Traditional dynamic programming methods face the curse of dimensionality, and existing neural network architectures, such as Deep Sets, struggle to capture complex, nonlinear interactions in high-dimensional data. The Set Transformer, with its attention mechanisms and induced set attention blocks, overcomes these limitations by efficiently approximating permutation-invariant functions, making it particularly well-suited for heterogeneous agent models. Using a methodology based on minimizing Euler residuals, this study ensures that the solution adheres to economic theory while achieving superior accuracy and scalability. Comparisons with Deep Sets across varying model complexities, including linear-quadratic benchmarks and more nonlinear settings, demonstrate the Set Transformer's significant advantages. These results underscore its potential to advance computational economics by providing robust and efficient tools for analyzing complex economic dynamics.
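The permutation invariance the abstract refers to can be illustrated with a minimal sketch: a single learned "seed" query attends over the set of agent states, so the pooled output does not depend on how the agents are ordered. This is a simplified, hypothetical single-head analogue of the pooling-by-multihead-attention (PMA) block in Lee et al. [2019]; the actual Set Transformer also uses multihead attention, layer normalization, and induced set attention blocks, none of which are shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical minimal sketch: single-head attention pooling with a
# learned seed query, loosely following the PMA idea of Lee et al. [2019].
d = 4
seed = rng.normal(size=(1, d))   # learned query ("seed" vector)
W_k = rng.normal(size=(d, d))    # key projection
W_v = rng.normal(size=(d, d))    # value projection

def pool_by_attention(X):
    """Pool a set X of shape (n, d) into one (1, d) summary vector."""
    K, V = X @ W_k, X @ W_v
    scores = softmax(seed @ K.T / np.sqrt(d))  # attention over set elements
    return scores @ V                          # weighted sum: order-invariant

X = rng.normal(size=(6, d))      # e.g., states of 6 heterogeneous agents
perm = rng.permutation(6)
out_original = pool_by_attention(X)
out_shuffled = pool_by_attention(X[perm])
print(np.allclose(out_original, out_shuffled))  # → True
```

Because the output is a weighted sum over set elements, shuffling the rows of `X` permutes the attention weights and values consistently, leaving the pooled vector unchanged; this is the property that makes such architectures natural for heterogeneous agent distributions, where agent identity carries no meaning.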
Keywords: High-Dimensional Dynamic Programming (HDDP), Neural Networks, Set Transformer, Permutation Invariance, Heterogeneous Agent Models