A Natural Representation of Functions that Facilitates `Exact Learning'
39 Pages Posted: 22 May 2020 Publication Status: Review Complete
We present a collection of mathematical tools and emphasise a fundamental representation of analytic functions. Connecting these concepts leads to a framework for `exact learning', in which an unknown numeric distribution could in principle be assigned an exact mathematical description. This is a new perspective on machine learning with potential applications in all domains of the mathematical sciences, and the generalised representations presented here have not yet been widely considered in the context of machine learning and data analysis. The moments of a multivariate function or distribution are extracted using a Mellin transform, and the generalised form of the coefficients is trained assuming a highly generalised Mellin-Barnes integral representation. The fit functions use many fewer parameters than contemporary machine learning methods, and any implementation that connects these concepts successfully will likely carry across to non-exact problems and provide approximate solutions. We compare the equations for the exact learning method with those for a neural network, which leads to a new perspective on understanding what a neural network may be learning and how to interpret the parameters of those networks.
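The moment-extraction step described above can be illustrated with a minimal numerical sketch. Here we treat f(x) = exp(-x) as a stand-in for an "unknown" distribution, evaluate its Mellin transform M[f](s) = ∫₀^∞ x^(s-1) f(x) dx at a few real points s by quadrature, and observe that the moments match Γ(s) exactly; recognising such a closed-form pattern in the moments is the kind of step an exact-learning fit would automate. The function name `mellin_moment` and the choice of test distribution are illustrative assumptions, not part of the paper's method.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

def mellin_moment(f, s):
    """Numerically evaluate the Mellin transform M[f](s) = int_0^inf x^(s-1) f(x) dx.

    Illustrative helper (not from the paper); valid here for s > 0.
    """
    val, _ = quad(lambda x: x**(s - 1) * f(x), 0, np.inf)
    return val

# Stand-in for an "unknown" numeric distribution.
f = lambda x: np.exp(-x)

# Extract moments at a few real points s and compare with Gamma(s).
for s in [1.0, 2.5, 4.0]:
    m = mellin_moment(f, s)
    print(f"s={s}: moment={m:.6f}, Gamma(s)={gamma(s):.6f}")
```

Because the extracted moments coincide with Γ(s), the closed form f(x) = exp(-x) can in principle be recovered via the inverse Mellin-Barnes integral, which is the essence of the "exact" description the abstract refers to.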
Keywords: Exact Learning, Machine Learning, Mellin Transforms, Integral Transforms, Theoretical Methods, Experimental Mathematics, Mathematical Sciences