Deep Learning by a Tensor Network Algorithm Provides Hyperfast Training of Business Finance
Posted: 6 Feb 2020 Last revised: 25 Mar 2020
Date Written: January 12, 2020
We show how tensor network theory and deep learning theory can be combined to provide a ground-state network (Orús, 2014) of financial information for hyperfast training of business finance. The resulting minimal-complexity structure encodes an infinite number of probable outcomes into a finite graphical alphabet of 12 potential dynamic relations, called double-entries: the pixels of financial information. Using the proposed many-layered (Serb & Prodromakis, 2019) financial wave function (Schrödinger, 1935) as a computational resource (Biamonte, 2016) allows hyperfast processing of financial information, one pixel at a time; see fig. 1. This reveals a highly entangled architecture (Levine, et al., 2019) in which complexity scales linearly, not exponentially (Huggins, et al., 2018). The new algorithm trains people in the fundamentals of business finance in about 10 hours, a process that takes at least one year under the conventional scheme. Results are supported by empirical evidence.
Keywords: Financial Literacy, Tensor Networks, Deep Learning, Complexity, Emergence, Information Reuse, Accounting