Deep Learning by a Unitary Tensor Network Algorithm Provides Hyperfast Financial Literacy
Posted: 6 Feb 2020 Last revised: 22 Apr 2022
Date Written: January 12, 2020
We show how tensor network theory and deep learning theory can be combined to provide a ground-state network (Orús, 2014) of financial information for hyperfast financial literacy. The resulting minimal-complexity structure encodes an infinite number of probable outcomes into a finite graphical alphabet of 12 potential dynamic relations, called double-entries: the pixels of financial information. Using the proposed many-layered (Serb & Prodromakis, 2019) financial wave function (Schrödinger, 1935) as a computational resource (Biamonte, 2016) allows hyperfast processing of financial statements, one pixel at a time; see Fig. 1. This reveals a highly entangled architecture (Levine et al., 2019) in which complexity scales linearly rather than exponentially (Huggins et al., 2018). The new algorithm trains people in financial accounting in 10 hours, a process that takes at least one year under the conventional scheme. These results are supported by empirical evidence.
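The linear (rather than exponential) scaling claimed above is a standard property of matrix product state (MPS) tensor networks, where a sequence is contracted one site at a time. The following sketch is not the paper's implementation; it only illustrates, under assumed names and parameters (`ALPHABET_SIZE`, `BOND_DIM`, `cores`, `amplitude`), how a finite alphabet of 12 double-entry symbols could be processed one "pixel" at a time at a cost linear in statement length.

```python
import numpy as np

# Illustrative sketch, not the authors' algorithm: an MPS-style
# contraction over a sequence drawn from a finite alphabet.
ALPHABET_SIZE = 12   # the 12 potential dynamic relations (double-entries)
BOND_DIM = 4         # bond dimension; caps how much entanglement is encoded

rng = np.random.default_rng(0)

# One (BOND_DIM x BOND_DIM) core tensor per symbol. A site-dependent MPS
# would give each position its own tensor; a shared set suffices here.
cores = rng.normal(size=(ALPHABET_SIZE, BOND_DIM, BOND_DIM)) / np.sqrt(BOND_DIM)

def amplitude(sequence):
    """Contract the chain left-to-right over a sequence of symbol indices.

    Each symbol costs one vector-matrix product of size BOND_DIM, so
    total work grows linearly with sequence length -- the linear,
    not exponential, scaling referred to in the abstract.
    """
    vec = np.ones(BOND_DIM)            # left boundary vector
    for symbol in sequence:
        vec = vec @ cores[symbol]      # absorb one "pixel" at a time
    return vec @ np.ones(BOND_DIM)     # right boundary closes the chain

# A toy "financial statement": a sequence of double-entry symbols (0..11).
statement = [3, 7, 0, 11, 5]
print(amplitude(statement))
```

Because the contraction never materializes the full exponential state space, the memory footprint stays at `BOND_DIM` numbers regardless of statement length.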
Keywords: Tensor Networks, Deep Learning, Computational Complexity, Algorithms, Emergence, Information Reuse, Financial Literacy, Accounting