The use of tensor networks for machine learning is an emerging topic. One branch of research uses a tensor network directly as the machine learning model architecture. Another uses tensor networks to compress layers in neural network architectures or for other auxiliary tasks.
Because tensor networks can be exactly mapped to quantum circuits, an exciting direction for tensor network machine learning is deploying and even training such models on quantum hardware.
Below are influential or groundbreaking works on the theory, algorithms, and applications of tensor networks for machine learning. Please help to expand this list by submitting a pull request to the tensornetwork.org GitHub repository.
Works are listed in reverse chronological order within each section.
From probabilistic graphical models to generalized tensor networks for supervised learning, Ivan Glasser, Nicola Pancotti, J. Ignacio Cirac, arxiv:1806.05964
Develops connections between tensor networks and probabilistic graphical models, proposes “generalized tensor network” architectures for machine learning, and explores the learning of local feature maps.
Supervised Learning with Tensor Networks, E.M. Stoudenmire and David J. Schwab, Advances in Neural Information Processing Systems 29 (2016) [1]
Proposes the use of tensor networks for supervised machine learning, emphasizing connections to the quantum physics literature, including related algorithms for optimization. State-of-the-art results are demonstrated on the MNIST handwritten-digit dataset.
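For concreteness, the model in this paper takes the form

$$ f^\ell(\mathbf{x}) = W^\ell \cdot \Phi(\mathbf{x}), \qquad \Phi(\mathbf{x}) = \phi(x_1) \otimes \phi(x_2) \otimes \cdots \otimes \phi(x_N), $$

where each component of the input is lifted by a local feature map such as $\phi(x_j) = \big(\cos(\tfrac{\pi}{2} x_j),\, \sin(\tfrac{\pi}{2} x_j)\big)$ and the weight tensor $W^\ell$ is represented as a matrix product state (MPS). Below is a minimal numpy sketch, not taken from the paper and with illustrative names and shapes, of how such a model can be evaluated by contracting the MPS cores against the local feature vectors:

```python
import numpy as np

def local_feature(x):
    # Local feature map used in the paper: phi(x) = (cos(pi x/2), sin(pi x/2))
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def mps_model(cores, x):
    """Evaluate f(x) = W . Phi(x) with the weight tensor W stored as an MPS.

    `cores` is a list of N arrays; core j has shape (D_left, 2, D_right),
    with D_left = 1 on the first core and D_right = 1 on the last. (The
    paper additionally attaches a label index to one core to obtain one
    score per class; that detail is omitted here.)
    """
    v = np.ones(1)
    for j, core in enumerate(cores):
        # Contract the physical index with the local feature vector, then
        # absorb the resulting matrix into the running boundary vector.
        mat = np.einsum('s,lsr->lr', local_feature(x[j]), core)
        v = v @ mat
    return v.item()

# Tiny usage example: a random MPS over N=4 inputs with bond dimension 3.
rng = np.random.default_rng(0)
N, D = 4, 3
dims = [1] + [D] * (N - 1) + [1]
cores = [0.5 * rng.normal(size=(dims[j], 2, dims[j + 1])) for j in range(N)]
print(mps_model(cores, rng.random(N)))
```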
Exponential Machines, Alexander Novikov, Mikhail Trofimov, Ivan Oseledets, arxiv:1605.03795 (ICLR 2017 workshop track paper)
Proposes the use of tensor networks for supervised machine learning with an elegant interpretation of the function represented by the model and a stochastic Riemannian optimization procedure. State-of-the-art performance is demonstrated on multiple datasets.
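In the paper's formulation, the model is a linear function over all $2^d$ products of distinct input features,

$$ \hat{y}(\mathbf{x}) = \sum_{i_1=0}^{1} \cdots \sum_{i_d=0}^{1} W_{i_1 \ldots i_d}\, x_1^{i_1} \cdots x_d^{i_d}, $$

where the exponentially large weight tensor $W$ is stored in the Tensor Train (matrix product) format, so that the $2^d$ interaction weights are parameterized by only $O(d r^2)$ numbers for TT-rank $r$.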
Tensor networks for unsupervised machine learning, Jing Liu, Sujie Li, Jiang Zhang, Pan Zhang, arxiv:2106.12974
Explores the idea of representing a sequence of conditional probabilities as separate tensor networks in order to learn a full joint probability distribution.
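The decomposition underlying this approach is the chain rule of probability,

$$ p(x_1, \ldots, x_N) = \prod_{i=1}^{N} p(x_i \mid x_1, \ldots, x_{i-1}), $$

with each conditional distribution modeled by its own tensor network, which allows exact sampling in the usual autoregressive manner.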
Tree tensor networks for generative modeling, Song Cheng, Lei Wang, Tao Xiang, and Pan Zhang. Phys. Rev. B 99, 155131 (2019). [2]
Unsupervised Generative Modeling Using Matrix Product States, Zhao-Yu Han, Jun Wang, Heng Fan, Lei Wang, and Pan Zhang, Phys. Rev. X 8, 031012 (2018). [3]
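The “Born machine” model introduced in this paper represents the data distribution through the squared amplitude of a wavefunction, in analogy with Born's rule in quantum mechanics:

$$ p(\mathbf{x}) = \frac{|\Psi(\mathbf{x})|^2}{Z}, \qquad Z = \sum_{\mathbf{x}} |\Psi(\mathbf{x})|^2, $$

where $\Psi$ is parameterized as a matrix product state, so that the normalization $Z$ is exactly computable and the model admits direct sampling.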
A block-sparse Tensor Train Format for sample-efficient high-dimensional Polynomial Regression, Michael Goette, Reinhold Schneider, Philipp Trunschke, arxiv:2104.14255 [5]
Probabilistic Graphical Models and Tensor Networks: A Hybrid Framework, Jacob Miller, Geoffrey Roeder, Tai-Danae Bradley, arxiv:2106.15666 [6]
Lower and Upper Bounds on the VC-Dimension of Tensor Network Models, Behnoush Khavari and Guillaume Rabusseau, arxiv:2106.11827 [7]
Expressive power of tensor-network factorizations for probabilistic modeling, Ivan Glasser, Ryan Sweke, Nicola Pancotti, Jens Eisert, Ignacio Cirac, Advances in Neural Information Processing Systems 32 (2019) [8]
Rigorously explores and compares the expressive power of various tensor network formats and parameterizations for probabilistic modeling of complicated distributions.