The use of tensor networks for machine learning is an emerging topic. One branch of research involves using a tensor network directly as a machine learning model architecture. Another uses tensor networks to compress layers in neural network architectures or for other auxiliary tasks.
Because tensor networks can be exactly mapped to quantum circuits, an exciting direction for tensor network machine learning is deploying and even training such models on quantum hardware.
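To make the first of these directions concrete, the following minimal sketch (all dimensions, the choice of local feature map, and the random "trained" cores are illustrative placeholders) shows how a matrix product state (MPS) can act directly as a classifier: each input is lifted to a product state by a local feature map and contracted with the MPS, and the open label index of one tensor supplies the class scores.

    import numpy as np

    # All dimensions and the random "trained" cores below are placeholders.
    N, d, chi, n_labels = 6, 2, 4, 3   # features, local dim, bond dim, classes
    rng = np.random.default_rng(0)

    # MPS cores of shape (left bond, physical, right bond); boundary bonds are 1.
    cores = [rng.normal(size=(1 if j == 0 else chi, d, 1 if j == N - 1 else chi))
             for j in range(N)]
    # One extra tensor mid-chain carries the open label index.
    label_tensor = rng.normal(size=(chi, n_labels, chi))

    def feature_map(x):
        # Lift each feature x_j in [0, 1] to a 2-vector (a common local map).
        return np.stack([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)], axis=1)

    def evaluate(x):
        phi = feature_map(x)                              # shape (N, d)
        v = np.einsum('j,ajb->b', phi[0], cores[0])       # left boundary
        for j in range(1, N // 2):
            v = np.einsum('a,j,ajb->b', v, phi[j], cores[j])
        vl = np.einsum('a,alb->lb', v, label_tensor)      # pick up the label index
        for j in range(N // 2, N):
            vl = np.einsum('la,j,ajb->lb', vl, phi[j], cores[j])
        return vl[:, 0]                                   # class scores

    x = rng.uniform(size=N)
    print(evaluate(x).argmax())   # predicted class for one toy input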
Influential or groundbreaking works on theory, algorithms, or applications of tensor networks for machine learning. Please help to expand this list by submitting a pull request to the tensornetwork.org GitHub repository.
Works are listed in reverse chronological order within each section.
Machine learning with tree tensor networks, CP rank constraints, and tensor dropout, Hao Chen, Thomas Barthel, arxiv:2305.19440
Uses tree tensor networks with a low-rank (low CP rank) representation of the individual tensors, plus other techniques such as dropout, to achieve supervised learning results with reduced computational cost and better generalization, on par with the state of the art.
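(Here "CP rank" refers to the canonical polyadic decomposition: a tensor of CP rank R is a sum of R outer products, e.g. for three indices

    T_{ijk} = \sum_{r=1}^{R} A_{ir} B_{jr} C_{kr},

so a d x d x d tensor costs 3dR parameters instead of d^3.)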
Interaction Decompositions for Tensor Network Regression, Ian Convy, K. Birgitta Whaley, arxiv:2208.06029
Analyzes the contributions of polynomial terms of various degrees to the supervised learning performance of tensor network architectures.
From probabilistic graphical models to generalized tensor networks for supervised learning, Ivan Glasser, Nicola Pancotti, J. Ignacio Cirac, arxiv:1806.05964
Explores connections between tensor networks and graphical models. Proposes “generalized tensor network” architectures for machine learning, and explores learning of local feature maps.
Supervised Learning with Tensor Networks, E.M. Stoudenmire and David J. Schwab, Advances in Neural Information Processing Systems 29 (2016) [1]
Proposes the use of tensor networks for supervised machine learning, emphasizing connections to the quantum physics literature, including related algorithms for optimization. State-of-the-art results are demonstrated on the MNIST handwritten digit dataset.
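For reference, the model proposed there is a linear classifier in an exponentially large feature space, with the weight tensor W^\ell compressed as an MPS:

    f^{\ell}(\mathbf{x}) = W^{\ell} \cdot \Phi(\mathbf{x}), \qquad
    \Phi(\mathbf{x}) = \phi(x_1) \otimes \phi(x_2) \otimes \cdots \otimes \phi(x_N),

where the local feature map used for grayscale image data is \phi(x_j) = (\cos(\pi x_j / 2), \sin(\pi x_j / 2)).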
Exponential Machines, Alexander Novikov, Mikhail Trofimov, Ivan Oseledets, arxiv:1605.03795 (ICLR 2017 workshop track paper)
Proposes the use of tensor networks for supervised machine learning with an elegant interpretation of the function represented by the model and a stochastic Riemannian optimization procedure. State-of-the-art performance is demonstrated on multiple datasets.
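The interpretation is that with the local map x_k -> (1, x_k), the model is linear in all 2^d multiplicative interactions of the input features, with the exponentially large coefficient tensor W held in tensor train format:

    f(\mathbf{x}) = \sum_{s_1=0}^{1} \cdots \sum_{s_d=0}^{1}
        W_{s_1 \ldots s_d}\, x_1^{s_1} x_2^{s_2} \cdots x_d^{s_d}.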
Learning multidimensional Fourier series with tensor trains, Sander Wahls, Visa Koivunen, H. Vincent Poor, Michel Verhaegen, 2014 IEEE Global Conference on Signal and Information Processing (GlobalSIP) p. 394–398 [2]
Combining Monte Carlo and Tensor-network Methods for Partial Differential Equations via Sketching, Yian Chen, Yuehaw Khoo, arxiv:2305.17884 (2023)
Generative modeling via Hierarchical Tensor Sketching, Y. Peng, Y. Chen, E. M. Stoudenmire, Y. Khoo, arxiv:2304.05305 (2023)
Quantum state tomography with tensor train cross approximation, Alexander Lidiak, Casey Jameson, Zhen Qin, Gongguo Tang, Michael B. Wakin, Zhihui Zhu, Zhexuan Gong, arxiv:2207.06397
Learns a quantum wavefunction from samples taken in different bases using the tensor train cross approximation algorithm.
Generative modeling via tensor train sketching, Y. Hur, J. G. Hoskins, M. Lindsey, E.M. Stoudenmire, Y. Khoo, arxiv:2202.11788 (2022)
A deterministic algorithm for learning probability distributions and other functions, based on sequential, sketched matrix factorizations.
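The "sketching" ingredient is easiest to see in its basic matrix form. The sketch below is a generic randomized range-finder demonstration (in the spirit of Halko, Martinsson, and Tropp), not the paper's specific algorithm, which builds a tensor train from a sequence of such sketched factorizations.

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy low-rank matrix standing in for one unfolding of a high-dimensional tensor.
    m, n, true_rank, sketch_size = 200, 150, 5, 10
    A = rng.normal(size=(m, true_rank)) @ rng.normal(size=(true_rank, n))

    # Sketch the column space through a random test matrix, then orthonormalize.
    Omega = rng.normal(size=(n, sketch_size))
    Q, _ = np.linalg.qr(A @ Omega)            # m x sketch_size orthonormal basis

    # Low-rank factorization A ~ Q (Q^T A) recovered from the sketch.
    B = Q.T @ A
    print(np.linalg.norm(A - Q @ B) / np.linalg.norm(A))   # tiny relative error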
Learning Feynman Diagrams with Tensor Trains, Yuriel Nunez-Fernandez, Matthieu Jeannin, Philipp T. Dumitrescu, Thomas Kloss, Jason Kaye, Olivier Parcollet, Xavier Waintal, arxiv:2207.06135
Learns complicated functions arising in series expansion methods (perturbation theory) used in quantum physics. Once these high-dimensional functions have been "actively" learned via the cross approximation, they can be integrated very efficiently.
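The cross approximation at the heart of this approach is simplest to state for matrices: a rank-r matrix can be reconstructed exactly from r well-chosen rows and columns, which is why these methods require relatively few function evaluations. A minimal sketch, with pivots chosen at random purely for illustration:

    import numpy as np

    rng = np.random.default_rng(2)

    # Toy rank-r matrix; TT cross methods apply this idea to unfoldings of a
    # high-dimensional function, evaluating it only along selected "crosses".
    m, n, r = 100, 80, 4
    A = rng.normal(size=(m, r)) @ rng.normal(size=(r, n))

    # Pivot rows I and columns J, chosen at random here; practical algorithms
    # select them adaptively (e.g. maxvol-style searches).
    I = rng.choice(m, size=r, replace=False)
    J = rng.choice(n, size=r, replace=False)

    # Skeleton (CUR) approximation assembled from sampled entries only:
    # A ~ A[:, J] @ inv(A[I, J]) @ A[I, :], exact when rank(A) = r.
    A_hat = A[:, J] @ np.linalg.solve(A[np.ix_(I, J)], A[I, :])
    print(np.linalg.norm(A - A_hat) / np.linalg.norm(A))   # ~ machine precision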
Generative modeling with projected entangled-pair states, Tom Vieijra, Laurens Vanderstraeten, Frank Verstraete, arxiv:2202.08177
Tensor networks for unsupervised machine learning, Jing Liu, Sujie Li, Jiang Zhang, Pan Zhang, arxiv:2106.12974
Explores the idea of representing a sequence of conditional probabilities as separate tensor networks in order to learn the full joint probability distribution.
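The construction rests on the chain rule p(x_1, ..., x_N) = \prod_i p(x_i | x_1, ..., x_{i-1}). A minimal sketch of sampling from such a factorized model, with dense lookup tables standing in for the per-conditional tensor networks used in the paper:

    import numpy as np

    rng = np.random.default_rng(3)
    N, d = 4, 2   # four variables, each taking d values

    # Placeholder conditionals p(x_i | x_1 ... x_{i-1}) as dense tables; in the
    # paper each conditional is represented by its own tensor network instead.
    conds = []
    for i in range(N):
        t = rng.random(size=(d,) * i + (d,))
        conds.append(t / t.sum(axis=-1, keepdims=True))

    def sample():
        # Draw one configuration via the chain rule p(x) = prod_i p(x_i | x_<i).
        x = []
        for i in range(N):
            p = conds[i][tuple(x)]        # distribution of site i given the prefix
            x.append(int(rng.choice(d, p=p)))
        return x

    print(sample())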
Tree tensor networks for generative modeling, Song Cheng, Lei Wang, Tao Xiang, and Pan Zhang. Phys. Rev. B 99, 155131 (2019). [3]
Learning relevant features of data with multi-scale tensor networks, E.M. Stoudenmire, Quant. Sci. Tech. 3, 034003 (2018). [arxiv:1801.00315] [4]
Uses a recursive tree tensor network algorithm to determine a reduced set of features from data.
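A rough caricature of one coarse-graining step in this spirit (simplified relative to the paper's procedure): neighboring features are merged, and the leading eigenvectors of each merged pair's covariance matrix define a tree isometry keeping only the dominant directions.

    import numpy as np

    rng = np.random.default_rng(4)

    # Toy data: n samples of N features, each embedded in a d-dimensional space.
    n, N, d, keep = 500, 8, 2, 3
    X = rng.normal(size=(n, N, d))

    def coarse_grain(X, keep):
        # One tree layer: merge neighboring feature pairs through isometries built
        # from the leading eigenvectors of each pair's covariance matrix.
        n, N, d = X.shape
        out = []
        for j in range(0, N - 1, 2):
            pair = np.einsum('na,nb->nab', X[:, j], X[:, j + 1]).reshape(n, d * d)
            rho = pair.T @ pair / n                        # covariance of the pair
            w, U = np.linalg.eigh(rho)                     # ascending eigenvalues
            iso = U[:, np.argsort(w)[::-1][:keep]]         # top-`keep` eigenvectors
            out.append(pair @ iso)                         # compressed pair feature
        return np.stack(out, axis=1)

    X1 = coarse_grain(X, keep)
    print(X.shape, '->', X1.shape)   # (500, 8, 2) -> (500, 4, 3)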
Unsupervised Generative Modeling Using Matrix Product States, Zhao-Yu Han, Jun Wang, Heng Fan, Lei Wang, and Pan Zhang, Phys. Rev. X 8, 031012 (2018). [5]
Learning Generative Models for Active Inference using Tensor Networks, Samuel T. Wauthier, Bram Vanhecke, Tim Verbelen, Bart Dhoedt, arxiv:2208.08713
Self-Correcting Quantum Many-Body Control using Reinforcement Learning with Tensor Networks, Friederike Metz, Marin Bukov, arxiv:2201.11790
Takes advantage of the ability of tensor networks to represent quantum many-body wavefunctions to parameterize a reinforcement learning architecture for the control of quantum systems.
Deterministic Tensor Network Classifiers, L. Wright, F. Barratt, J. Dborin, V. Wimalaweera, B. Coyle, A. G. Green, arxiv:2205.09768
Data compression for quantum machine learning, Rohit Dilip, Yu-Jie Liu, Adam Smith, Frank Pollmann, arxiv:2204.11170
Matrix Product State Pre-Training for Quantum Machine Learning, James Dborin, Fergus Barratt, Vinul Wimalaweera, Lewis Wright, Andrew G. Green, arxiv:2106.05742
Hierarchical quantum classifiers, Edward Grant, Marcello Benedetti, Shuxiang Cao, Andrew Hallam, Joshua Lockhart, Vid Stojevic, Andrew G. Green, Simone Severini, arxiv:1804.03680
Towards Quantum Machine Learning with Tensor Networks, William Huggins, Piyush Patel, K. Birgitta Whaley, E. Miles Stoudenmire, arxiv:1803.11537
Distributive Pre-training of Generative Modeling Using Matrix Product States, Sheng-Hsuan Lin, Olivier Kuijpers, Sebastian Peterhansl, Frank Pollmann
Summation of MPS representations of data as a pre-training or initialization technique.
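The summation mentioned here has a simple closed form: corresponding MPS cores are direct-summed over their bond indices (boundary cores concatenated, interior cores made block-diagonal), so the resulting MPS evaluates to the sum of the two original amplitudes. A minimal sketch:

    import numpy as np

    def mps_add(A, B):
        # Cores have shape (left bond, physical, right bond); boundary bonds are 1.
        C = []
        for i, (a, b) in enumerate(zip(A, B)):
            if i == 0:                        # concatenate along the right bond
                C.append(np.concatenate([a, b], axis=2))
            elif i == len(A) - 1:             # concatenate along the left bond
                C.append(np.concatenate([a, b], axis=0))
            else:                             # block-diagonal in both bonds
                la, d, ra = a.shape
                lb, _, rb = b.shape
                c = np.zeros((la + lb, d, ra + rb))
                c[:la, :, :ra] = a
                c[la:, :, ra:] = b
                C.append(c)
        return C

    def amplitude(mps, cfg):
        # Evaluate one amplitude by contracting the chain at configuration cfg.
        v = mps[0][:, cfg[0], :]
        for core, s in zip(mps[1:], cfg[1:]):
            v = v @ core[:, s, :]
        return v[0, 0]

    rng = np.random.default_rng(5)
    A = [rng.normal(size=s) for s in [(1, 2, 3), (3, 2, 3), (3, 2, 1)]]
    B = [rng.normal(size=s) for s in [(1, 2, 2), (2, 2, 2), (2, 2, 1)]]
    S = mps_add(A, B)
    cfg = (1, 0, 1)
    print(np.isclose(amplitude(S, cfg), amplitude(A, cfg) + amplitude(B, cfg)))  # True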
Modeling sequences with quantum states: a look under the hood, Tai-Danae Bradley, E M Stoudenmire, and John Terilla, Machine Learning: Science and Technology, 1 035008 (2020) [6]
A deterministic algorithm for learning "Born rule" distributions, based on forming and diagonalizing density matrices.
A block-sparse Tensor Train Format for sample-efficient high-dimensional Polynomial Regression, Michael Goette, Reinhold Schneider, Philipp Trunschke, arxiv:2104.14255 [8]
Probabilistic Graphical Models and Tensor Networks: A Hybrid Framework, Jacob Miller, Geoffrey Roeder, Tai-Danae Bradley, arxiv:2106.15666 [9]
Lower and Upper Bounds on the VC-Dimension of Tensor Network Models, Behnoush Khavari and Guillaume Rabusseau, arxiv:2106.11827 [10]
Expressive power of tensor-network factorizations for probabilistic modeling, Ivan Glasser, Ryan Sweke, Nicola Pancotti, Jens Eisert, Ignacio Cirac, Advances in Neural Information Processing Systems 32 (2019) [11]
Rigorously explores and compares the expressive power of various tensor network formats and parameterizations, and the formalisms for using them to model complicated functions.