
Applications of Tensor Networks to Machine Learning

The use of tensor networks for machine learning is an emerging topic. One branch of research uses a tensor network directly as the machine learning model architecture. Another uses tensor networks to compress layers of neural network architectures or for other auxiliary tasks.
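As a minimal illustration of the first branch, the sketch below evaluates a matrix product state used directly as a classifier, in the spirit of Supervised Learning with Tensor Networks [1]: each input component is lifted by a local feature map, the feature vectors are contracted into the MPS, and the tensor at one site carries an extra label index yielding one score per class. The shapes, feature map, and function names here are illustrative choices, not a fixed convention.

```python
import numpy as np

def feature_map(x):
    # Local feature map from Stoudenmire & Schwab (2016): each component
    # x_j in [0, 1] becomes the 2-vector (cos(pi x_j / 2), sin(pi x_j / 2)).
    return np.stack([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)], axis=-1)

def mps_classifier(x, tensors, label_site):
    # Contract the feature vectors into an MPS whose tensor at `label_site`
    # has an extra open "label" index; the result is one score per class.
    phi = feature_map(x)             # shape (N, 2)
    left = np.ones(1)                # dummy boundary bond of dimension 1
    out = None
    for j, A in enumerate(tensors):
        if j == label_site:
            # A has shape (Dl, 2, n_classes, Dr): absorb phi[j], keep label open.
            out = np.einsum('l,s,lsor->or', left, phi[j], A)
        elif out is None:
            # Before the label site: A has shape (Dl, 2, Dr).
            left = np.einsum('l,s,lsr->r', left, phi[j], A)
        else:
            # After the label site: carry the open label index along.
            out = np.einsum('ol,s,lsr->or', out, phi[j], A)
    return out[:, 0]                 # class scores, shape (n_classes,)

# Tiny random example: 6 input components, bond dimension 4, 3 classes.
rng = np.random.default_rng(0)
N, D, C = 6, 4, 3
tensors = []
for j in range(N):
    Dl, Dr = (1 if j == 0 else D), (1 if j == N - 1 else D)
    shape = (Dl, 2, C, Dr) if j == N // 2 else (Dl, 2, Dr)
    tensors.append(rng.normal(size=shape))
print(mps_classifier(rng.uniform(size=N), tensors, label_site=N // 2))
```

In practice the tensors would be trained rather than random, for example by the DMRG-style sweeping optimization of [1] or by gradient descent on a loss over the output scores.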

Because tensor networks can be exactly mapped to quantum circuits, an exciting direction for tensor network machine learning is deploying and even training such models on quantum hardware.
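A quick numerical sketch of why this mapping exists (assuming NumPy only; the completion step via projection and QR is one standard choice among several): each tensor of a left-canonical MPS, reshaped to a matrix, is an isometry, and any isometry can be completed to a unitary, i.e. a quantum gate, by extending its columns to an orthonormal basis.

```python
import numpy as np

def isometry_to_unitary(A):
    # A left-canonical MPS tensor of shape (Dl, d, Dr) reshapes to an
    # isometry V of shape (Dl*d, Dr) with V^dagger V = I. Extending its
    # columns to an orthonormal basis gives a unitary, i.e. a gate on
    # log2(Dl*d) qubits when Dl*d is a power of two.
    Dl, d, Dr = A.shape
    V = A.reshape(Dl * d, Dr)
    # Project random columns onto the orthogonal complement of V,
    # then orthonormalize them with QR.
    rest = np.random.default_rng(1).normal(size=(Dl * d, Dl * d - Dr))
    rest -= V @ (V.conj().T @ rest)
    Q, _ = np.linalg.qr(rest)
    U = np.hstack([V, Q])
    assert np.allclose(U.conj().T @ U, np.eye(Dl * d))
    return U

# Example: a random left-canonical tensor with Dl = d = Dr = 2 becomes
# a 4 x 4 unitary, i.e. a two-qubit gate.
M = np.random.default_rng(0).normal(size=(4, 2))
V, _ = np.linalg.qr(M)   # orthonormal columns -> a valid isometry
U = isometry_to_unitary(V.reshape(2, 2, 2))
print(U.shape)           # (4, 4)
```

Applying such gates in sequence prepares the MPS on a quantum device, which underlies proposals for running and training tensor network models on quantum hardware.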

Selected Works on Tensor Networks for Machine Learning

Influential or groundbreaking works on theory, algorithms, or applications of tensor networks for machine learning. Please help to expand this list by submitting a pull request to the tensornetwork.org GitHub repository.

Works are listed in reverse chronological order within each section.

Supervised Learning and Regression

Unsupervised Learning / Generative Modeling

Reinforcement Learning

Implementation on Quantum Computers (Quantum Machine Learning)

Optimization Methods (Gradient Free)

Compression of Neural Networks

Mathematical or Theoretical Works

Language Modeling

References

  1. Supervised Learning with Tensor Networks, Edwin Stoudenmire, David J Schwab, Advances in Neural Information Processing Systems 29 (2016)
  2. Learning multidimensional Fourier series with tensor trains, Sander Wahls, Visa Koivunen, H Vincent Poor, Michel Verhaegen, 2014 IEEE Global Conference on Signal and Information Processing (GlobalSIP), 394–398 (2014)
  3. Tree tensor networks for generative modeling, Song Cheng, Lei Wang, Tao Xiang, Pan Zhang, Phys. Rev. B 99, 155131 (2019)
  4. Learning relevant features of data with multi-scale tensor networks, E Miles Stoudenmire, Quantum Science and Technology 3, 034003 (2018)
  5. Unsupervised Generative Modeling Using Matrix Product States, Zhao-Yu Han, Jun Wang, Heng Fan, Lei Wang, Pan Zhang, Phys. Rev. X 8, 031012 (2018)
  6. Modeling sequences with quantum states: a look under the hood, Tai-Danae Bradley, E M Stoudenmire, John Terilla, Machine Learning: Science and Technology 1, 035008 (2020)
  7. Tensorizing Neural Networks, Alexander Novikov, Dmitry Podoprikhin, Anton Osokin, Dmitry Vetrov, Advances in Neural Information Processing Systems 28 (2015)
  8. A block-sparse Tensor Train Format for sample-efficient high-dimensional Polynomial Regression, Michael Goette, Reinhold Schneider, Philipp Trunschke (2021), arXiv:2104.14255
  9. Probabilistic Graphical Models and Tensor Networks: A Hybrid Framework, Jacob Miller, Geoffrey Roeder, Tai-Danae Bradley (2021), arXiv:2106.15666
  10. Lower and Upper Bounds on the VC-Dimension of Tensor Network Models, Behnoush Khavari, Guillaume Rabusseau (2021), arXiv:2106.11827
  11. Expressive power of tensor-network factorizations for probabilistic modeling, Ivan Glasser, Ryan Sweke, Nicola Pancotti, Jens Eisert, Ignacio Cirac, Advances in Neural Information Processing Systems 32 (2019)
