Tensor Networks Meet Neural Networks: A Survey
Tensor networks (TNs) and neural networks (NNs) are two fundamental data modeling approaches. TNs were proposed to address the curse of dimensionality in large-scale tensors by converting exponential storage requirements into polynomial complexity, and have consequently attracted extensive study in quantum physics and machine learning. NNs, on the other hand, are computing systems inspired by the biological neural networks that constitute human brains. Recently, NNs and their variants have achieved outstanding performance in various applications, e.g., computer vision, natural language processing, and robotics. Interestingly, although the two types of networks originate from different observations, they are inextricably linked through the intrinsic multilinear structure they share. Consequently, a significant body of work on combining TNs with NNs has emerged. In this paper, the combinations described as “tensor networks meet neural networks” are termed tensorial neural networks (TNNs). This survey introduces TNNs from three aspects. 1) Network Compression. TNs can greatly reduce the number of parameters in NNs, supporting the construction of compact yet effective models. 2) Information Fusion. TNs can naturally and effectively enhance NNs by modeling the interactions among multiple modalities, views, or sources of data. 3) Quantum Circuit Simulation. TNs can assist in designing and simulating quantum neural networks (QNNs). This survey also investigates methods for improving TNNs, examines useful toolboxes for implementing TNNs, and documents TNN development while highlighting potential future directions. To the best of our knowledge, this is the first comprehensive survey to bridge the connections among NNs, TNs, and quantum circuits. We provide a curated list of TNNs at https://github.com/tnbar/awesome-tensorial-neural-networks.
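The parameter savings underlying the network-compression aspect can be illustrated with a minimal sketch: a dense layer's weight matrix, with its input and output dimensions factored into smaller modes, is stored as tensor-train (TT) cores instead of a full matrix. The mode sizes, TT rank, and random cores below are illustrative assumptions, not a method from any specific TNN in the survey.

```python
import numpy as np

# Illustrative sketch: a 256x256 dense-layer weight matrix (65,536 parameters)
# stored as a rank-4 tensor-train, with 256 = 16*16 on both the input and
# output sides. All sizes here are hypothetical choices for demonstration.
m1, m2, n1, n2, r = 16, 16, 16, 16, 4

rng = np.random.default_rng(0)
# TT cores: G1 covers modes (m1, n1) and exposes a rank-r bond;
# G2 absorbs that bond and covers modes (m2, n2).
G1 = rng.standard_normal((m1, n1, r))   # shape (16, 16, 4)
G2 = rng.standard_normal((r, m2, n2))   # shape (4, 16, 16)

# Contract the cores over the shared rank index and regroup the modes
# to recover a full (m1*m2) x (n1*n2) weight matrix.
W = np.einsum('abr,rcd->acbd', G1, G2).reshape(m1 * m2, n1 * n2)

tt_params = G1.size + G2.size   # 1024 + 1024 = 2048
full_params = W.size            # 65536
print(full_params, tt_params)   # 32x fewer parameters in TT form
```

Because the cores are contracted only when (or if) the full matrix is needed, storage and, with suitable contraction orders, computation scale polynomially in the mode sizes and rank rather than with the full dimension, which is the mechanism the survey's compression aspect builds on.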