Fitting Gamma Mixture Density Networks with Expectation-Maximization Algorithm
36 Pages · Posted: 23 Nov 2020
Date Written: October 5, 2020
Abstract
We discuss how mixtures of Gamma distributions, with shape and rate parameters depending on explanatory variables, can be fitted with neural networks. We develop two versions of the Expectation-Maximization (EM) algorithm for fitting Gamma Mixture Density Networks, which we call the EM network boosting algorithm and the EM forward network algorithm. The key difference between the two versions is how the information about the trained neural networks and the predicted parameters is passed between iterations of the EM algorithm. We validate our EM algorithms and test different ways in which they can be applied efficiently in practice. Our algorithms work for general mixtures of any distributions with closed-form densities.
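To make the setting concrete, the sketch below illustrates the E-step of such an EM scheme for a mixture density of the form p(y|x) = sum_k pi_k(x) Gamma(y; alpha_k(x), beta_k(x)), where the mixture weights and the shape and rate parameters are predicted from covariates by neural networks. This is a minimal illustration under assumed conventions, not the authors' implementation; the function names, array shapes, and parameterization are illustrative.

```python
# Minimal sketch (illustrative, not the paper's code) of the E-step of an EM
# iteration for a Gamma mixture whose parameters come from neural networks.
import numpy as np
from scipy.stats import gamma as gamma_dist


def gamma_density(y, shape, rate):
    # scipy parameterizes the Gamma distribution by shape and scale = 1 / rate
    return gamma_dist.pdf(y, a=shape, scale=1.0 / rate)


def e_step(y, mix_probs, shapes, rates):
    """Posterior component probabilities (responsibilities) per observation.

    y:         (n,)   responses
    mix_probs: (n, K) mixture weights pi_k(x) predicted from covariates
    shapes:    (n, K) shape parameters alpha_k(x)
    rates:     (n, K) rate parameters beta_k(x)
    """
    dens = np.stack(
        [gamma_density(y, shapes[:, k], rates[:, k]) for k in range(shapes.shape[1])],
        axis=1,
    )
    weighted = mix_probs * dens
    return weighted / weighted.sum(axis=1, keepdims=True)


# In the M-step one would refit the neural networks by maximizing the
# responsibility-weighted Gamma log-likelihood. How the fitted networks and
# predicted parameters are carried over to the next EM iteration is what
# distinguishes the boosting and forward variants described in the abstract.
```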
Keywords: Expectation-Maximization, neural networks, boosting, mixtures of distributions
JEL Classification: G22, C45
