
Page 1

Introduction to Tensor Networks for Machine Learning

E.M. Stoudenmire, Jun 2020 - CMCity

Page 2

Machine learning galvanizing industry & science

Self-driving cars
Language Processing
Medicine
Materials Science / Chemistry

Page 3

Google rebranded as a "machine learning first company"

Neural nets replaced the linguistic approach in Google Translate

Quantum machine learning

Page 4

Examples of Machine Learning

Page 5

Image recognition

ImageNet Classification with Deep Convolutional Neural Networks

Alex Krizhevsky, University of Toronto, [email protected]
Ilya Sutskever, University of Toronto, [email protected]
Geoffrey E. Hinton, University of Toronto, [email protected]

Abstract

We trained a large, deep convolutional neural network to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes. On the test data, we achieved top-1 and top-5 error rates of 37.5% and 17.0%, which is considerably better than the previous state-of-the-art. The neural network, which has 60 million parameters and 650,000 neurons, consists of five convolutional layers, some of which are followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax. To make training faster, we used non-saturating neurons and a very efficient GPU implementation of the convolution operation. To reduce overfitting in the fully-connected layers we employed a recently-developed regularization method called "dropout" that proved to be very effective. We also entered a variant of this model in the ILSVRC-2012 competition and achieved a winning top-5 test error rate of 15.3%, compared to 26.2% achieved by the second-best entry.

2012 paper that launched the recent deep learning era (20k citations)

[The slide reproduces the paper's first page and its Figure 4: eight ILSVRC-2010 test images with the model's five most probable labels, and five test images paired with the six training images whose last-hidden-layer feature vectors are nearest to theirs.]

ImageNet:

• 1.2 million training images (150k test)

• 1000 categories

• 15.3% top-5 error (neural net)

• 26.2% top-5 error (next best)
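The top-1 and top-5 error rates quoted above count a prediction as wrong when the true label is not the highest-scoring class, or not among the five highest-scoring classes, respectively. A minimal sketch of the computation (toy scores; NumPy assumed):

```python
import numpy as np

def topk_error(scores, labels, k):
    """Fraction of samples whose true label is not among the k largest scores."""
    topk = np.argsort(scores, axis=1)[:, -k:]     # indices of the k best classes
    hit = (topk == labels[:, None]).any(axis=1)   # is the true label among them?
    return 1.0 - hit.mean()

rng = np.random.default_rng(0)
scores = rng.normal(size=(4, 10))                 # 4 toy samples, 10 classes
labels = rng.integers(0, 10, size=4)
print(topk_error(scores, labels, k=1), topk_error(scores, labels, k=5))
```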

Page 6

Sound prediction

Visually Indicated Sounds

Andrew Owens¹, Phillip Isola²·¹, Josh McDermott¹, Antonio Torralba¹, Edward H. Adelson¹, William T. Freeman¹·³
¹MIT, ²U.C. Berkeley, ³Google Research

[Figure 1 panels: "Input video" frames above "Predicted Sound (Amplitude^1/2)" waveforms; axes run 0-7 seconds (Time) and -0.5 to 0.5 (Amplitude^1/2).]

Figure 1: We train a model to synthesize plausible impact sounds from silent videos, a task that requires implicit knowledge of material properties and physical interactions. In each video, someone probes the scene with a drumstick, hitting and scratching different objects. We show frames from two videos and below them the predicted audio tracks. The locations of these sampled frames are indicated by the dotted lines on the audio track. The predicted audio tracks show seven seconds of sound, corresponding to multiple hits in the videos.

Abstract

Objects make distinctive sounds when they are hit or scratched. These sounds reveal aspects of an object's material properties, as well as the actions that produced them. In this paper, we propose the task of predicting what sound an object makes when struck as a way of studying physical interactions within a visual scene. We present an algorithm that synthesizes sound from silent videos of people hitting and scratching objects with a drumstick. This algorithm uses a recurrent neural network to predict sound features from videos and then produces a waveform from these features with an example-based synthesis procedure. We show that the sounds predicted by our model are realistic enough to fool participants in a "real or fake" psychophysical experiment, and that they convey significant information about material properties and physical interactions.

1. Introduction

From the clink of a ceramic mug placed onto a saucer, to the squelch of a shoe pressed into mud, our days are filled with visual experiences accompanied by predictable sounds. On many occasions, these sounds are not just statistically associated with the content of the images – the way, for example, that the sounds of unseen seagulls are associated with a view of a beach – but instead are directly caused by the physical interaction being depicted: you see what is making the sound.

We call these events visually indicated sounds, and we propose the task of predicting sound from videos as a way to study physical interactions within a visual scene (Figure 1). To accurately predict a video's held-out soundtrack, an algorithm has to know about the physical properties of what it is seeing and the actions that are being performed. This task implicitly requires material recognition, but unlike traditional work on this problem [4, 38], we never explicitly tell the algorithm about materials. Instead, it learns about them by identifying statistical regularities in the raw audiovisual signal.

We take inspiration from the way infants explore the physical properties of a scene by poking and prodding at the objects in front of them [36, 3], a process that may help them learn an intuitive theory of physics [3]. Recent work suggests that the sounds objects make in response to these interactions may play a role in this process [39, 43].

We introduce a dataset that mimics this exploration process, containing hundreds of videos of people hitting, scratching, and prodding objects with a drumstick. To synthesize sound from these videos, we present an algorithm that uses a recurrent neural network to map videos to audio features. It then converts these audio features to a waveform, either by matching them to exemplars in a database

arXiv:1512.08512v2 [cs.CV] 30 Apr 2016

http://vis.csail.mit.edu
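As a very rough sketch of the pipeline the abstract describes (not the authors' code; all layer sizes and feature dimensions below are made-up assumptions), an RNN maps per-frame visual features to audio features, and a waveform is then chosen by nearest-exemplar matching:

```python
import torch
import torch.nn as nn

class SoundPredictor(nn.Module):
    """LSTM mapping per-frame visual features to per-frame audio features."""
    def __init__(self, vis_dim=512, aud_dim=42, hidden=256):  # sizes assumed
        super().__init__()
        self.rnn = nn.LSTM(vis_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, aud_dim)

    def forward(self, frames):                 # frames: (batch, T, vis_dim)
        h, _ = self.rnn(frames)
        return self.head(h)                    # (batch, T, aud_dim)

def nearest_exemplar(pred, bank_feats, bank_waves):
    """Example-based synthesis stand-in: return the waveform whose audio
    features are closest to the prediction (pred: (T, aud_dim),
    bank_feats: (M, T, aud_dim), bank_waves: list of M waveforms)."""
    d = ((bank_feats - pred) ** 2).sum(dim=(1, 2))
    return bank_waves[int(d.argmin())]

feats = SoundPredictor()(torch.randn(1, 45, 512))[0]   # one toy 45-frame clip
```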

Page 7

Image Generation


Figure 7: Vector arithmetic for visual concepts. For each column, the Z vectors of samples are averaged. Arithmetic was then performed on the mean vectors creating a new vector Y. The center sample on the right hand side is produced by feeding Y as input to the generator. To demonstrate the interpolation capabilities of the generator, uniform noise sampled with scale ±0.25 was added to Y to produce the 8 other samples. Applying arithmetic in the input space (bottom two examples) results in noisy overlap due to misalignment.
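In outline, the procedure in the caption is just arithmetic on latent vectors (a sketch with placeholder arrays; `G` names an assumed trained DCGAN generator and is not defined here):

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 100                                      # DCGAN's Z dimension

# Z vectors previously sampled for each visual concept (placeholders here).
z_concept_a = rng.normal(size=(3, dim))        # e.g. "smiling woman"
z_concept_b = rng.normal(size=(3, dim))        # e.g. "neutral woman"
z_concept_c = rng.normal(size=(3, dim))        # e.g. "neutral man"

# Average each concept's samples, then do arithmetic on the mean vectors.
Y = z_concept_a.mean(0) - z_concept_b.mean(0) + z_concept_c.mean(0)

# Perturb Y with uniform noise of scale +-0.25, as in the caption.
samples = Y + rng.uniform(-0.25, 0.25, size=(8, dim))
# images = G(np.vstack([Y, samples]))          # G: trained generator (assumed)
```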

Further work is needed to tackle this form of instability. We think that extending this framework


Under review as a conference paper at ICLR 2016

UNSUPERVISED REPRESENTATION LEARNING WITH DEEP CONVOLUTIONAL GENERATIVE ADVERSARIAL NETWORKS

Alec Radford & Luke Metz, indico Research, Boston, MA, {alec,luke}@indico.io
Soumith Chintala, Facebook AI Research, New York, [email protected]

ABSTRACT

In recent years, supervised learning with convolutional networks (CNNs) has seen huge adoption in computer vision applications. Comparatively, unsupervised learning with CNNs has received less attention. In this work we hope to help bridge the gap between the success of CNNs for supervised learning and unsupervised learning. We introduce a class of CNNs called deep convolutional generative adversarial networks (DCGANs), that have certain architectural constraints, and demonstrate that they are a strong candidate for unsupervised learning. Training on various image datasets, we show convincing evidence that our deep convolutional adversarial pair learns a hierarchy of representations from object parts to scenes in both the generator and discriminator. Additionally, we use the learned features for novel tasks - demonstrating their applicability as general image representations.

1 INTRODUCTION

Learning reusable feature representations from large unlabeled datasets has been an area of active research. In the context of computer vision, one can leverage the practically unlimited amount of unlabeled images and videos to learn good intermediate representations, which can then be used on a variety of supervised learning tasks such as image classification. We propose that one way to build good image representations is by training Generative Adversarial Networks (GANs) (Goodfellow et al., 2014), and later reusing parts of the generator and discriminator networks as feature extractors for supervised tasks. GANs provide an attractive alternative to maximum likelihood techniques. One can additionally argue that their learning process and the lack of a heuristic cost function (such as pixel-wise independent mean-square error) are attractive to representation learning. GANs have been known to be unstable to train, often resulting in generators that produce nonsensical outputs. There has been very limited published research in trying to understand and visualize what GANs learn, and the intermediate representations of multi-layer GANs.

In this paper, we make the following contributions:

• We propose and evaluate a set of constraints on the architectural topology of Convolutional GANs that make them stable to train in most settings. We name this class of architectures Deep Convolutional GANs (DCGAN).

• We use the trained discriminators for image classification tasks, showing competitive performance with other unsupervised algorithms.

• We visualize the filters learnt by GANs and empirically show that specific filters have learned to draw specific objects.


arXiv:1511.06434v2 [cs.LG] 7 Jan 2016

Page 8

Success at tasks previously thought impossible

Page 9

What is machine learning?

Page 10

What is machine learning?

Data-driven problem solving

Page 11

What is machine learning?

Any system that, given more data, performs increasingly well at some task

Data-driven problem solving

Page 12

What is machine learning?

Framework / philosophy, not single method

Any system that, given more data, performs increasingly well at some task

Data-driven problem solving

Page 13

Software 1.0

Software 2.0

https://medium.com/@karpathy/software-2-0-a64152b37c35

Page 14

Basics of Machine Learning

Page 15

Example of a Dataset – Fashion MNIST

70,000 labeled images, 28×28 grayscale, 10 categories (labels)
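One common way to load this dataset is through torchvision (an assumption; any Fashion-MNIST loader gives the same 60,000/10,000 train/test split):

```python
from torchvision import datasets, transforms

# Downloads Fashion-MNIST on first use: 70,000 labeled 28x28 grayscale
# images split into 60,000 training and 10,000 test examples, 10 labels.
train = datasets.FashionMNIST(root="data", train=True, download=True,
                              transform=transforms.ToTensor())
test = datasets.FashionMNIST(root="data", train=False, download=True,
                             transform=transforms.ToTensor())
img, label = train[0]
print(img.shape, label)   # torch.Size([1, 28, 28]) and an int in 0..9
```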

Page 16

Anatomy of a Dataset

training set / test set

Page 17

Anatomy of a Dataset

training set / test set

validation set

Page 18

Anatomy of a Dataset

training set / test set

validation set 1

Page 19

Anatomy of a Dataset

training set / test set

validation set 2

Page 20

Anatomy of a Dataset

training set / test set

validation set 3

Page 21

Anatomy of a Dataset

training set / test set

validation set 4
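The rotating validation sets on the preceding slides are the usual k-fold picture: the training set is split into k parts and each part takes one turn as the validation set, while the test set stays untouched. A minimal index-splitting sketch:

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Yield k (train, validation) index splits of a size-n training set."""
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, k)
    for i in range(k):
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, folds[i]

for fold, (tr, va) in enumerate(kfold_indices(1000, 4), start=1):
    print(f"validation set {fold}: {len(tr)} train, {len(va)} validation")
```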

Page 22

Types of learning tasks:

Page 23

Types of learning tasks:

• Supervised learning (labeled data)

Page 24

Types of learning tasks:

• Supervised learning (labeled data)

• Unsupervised learning (unlabeled data)

Page 25

Types of learning tasks:

• Supervised learning (labeled data)

• Unsupervised learning (unlabeled data)

• Reinforcement learning (must collect data)

Page 26

Types of learning tasks:

• Supervised learning (labeled data)

• Unsupervised learning (unlabeled data)

• Reinforcement learning (must collect data)

(ordered by a priori knowledge, from high to low)

Page 27

Supervised Learning

Given labeled training data (labels $A$ and $B$), find decision function $f(x)$ such that

$$f(x) > 0 \;\;\text{for}\;\; x \in A, \qquad f(x) < 0 \;\;\text{for}\;\; x \in B$$

Example: identify photos of alligators and bears
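As a concrete toy example of such a decision function, take a linear score whose sign picks the class (the weights here are illustrative, not trained):

```python
import numpy as np

def f(x, w, b):
    """Linear decision function; training would choose w and b."""
    return x @ w + b

def classify(x, w, b):
    return "A" if f(x, w, b) > 0 else "B"

w, b = np.array([1.0, -2.0]), 0.5               # illustrative parameters
print(classify(np.array([3.0, 1.0]), w, b))     # f = 1.5 > 0, so "A"
```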

Page 28

Supervised Learning

Given training set $\{x_j, y_j\}$, minimize the cost function

$$C = \frac{1}{N_T} \sum_j \left( f(x_j) - y_j \right)^2, \qquad y_j = \begin{cases} +1, & x_j \in A \\ -1, & x_j \in B \end{cases}$$

by varying the adjustable parameters of $f$.

Typical strategy: the cost function measures the distance of the trial function $f(x_j)$ from the idealized "indicator" function $y_j$.
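A minimal sketch of this minimization for a linear model $f(x) = w \cdot x + b$, using plain gradient descent on $C$ (synthetic two-dimensional data; learning rate and iteration count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(100, 2))
y = np.where(x[:, 0] + x[:, 1] > 0, 1.0, -1.0)   # y_j = +1 (A) or -1 (B)

w, b, lr = np.zeros(2), 0.0, 0.1
for _ in range(200):
    r = x @ w + b - y                  # residuals f(x_j) - y_j
    C = (r ** 2).mean()                # the cost function C above
    w -= lr * 2 * (x.T @ r) / len(x)   # gradient dC/dw
    b -= lr * 2 * r.mean()             # gradient dC/db
print(C, w, b)                         # C shrinks as f approaches y
```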

Page 29

Unsupervised Learning

Given unlabeled training data $\{x_j\}$

Page 30

Unsupervised Learning

Given unlabeled training data $\{x_j\}$

• Find function $f(x)$ such that $f(x_j) \simeq p(x_j)$

Page 31

Unsupervised Learning

Given unlabeled training data $\{x_j\}$

• Find function $f(x)$ such that $f(x_j) \simeq p(x_j)$

• Find function $f(x)$ such that $|f(x_j)|^2 \simeq p(x_j)$

Page 32

Unsupervised Learning

Given unlabeled training data $\{x_j\}$

• Find function $f(x)$ such that $f(x_j) \simeq p(x_j)$

• Find function $f(x)$ such that $|f(x_j)|^2 \simeq p(x_j)$

• Find data clusters and which data belongs to each cluster

Page 33

Unsupervised Learning

Given unlabeled training data $\{x_j\}$

• Find function $f(x)$ such that $f(x_j) \simeq p(x_j)$ (see the sketch below)

• Find function $f(x)$ such that $|f(x_j)|^2 \simeq p(x_j)$

• Find data clusters and which data belongs to each cluster

• Discover reduced representations of data for other learning tasks (e.g. supervised)
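A minimal sketch of the first bullet, taking $f$ to be a kernel density estimate built directly from the unlabeled samples (toy one-dimensional data; the bandwidth is an arbitrary choice). The second bullet differs only in modeling the density through $|f|^2$:

```python
import numpy as np

rng = np.random.default_rng(0)
xs = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])

def f(x, data=xs, h=0.2):
    """Gaussian kernel density estimate: f(x) ~= p(x) for the data above."""
    k = np.exp(-((x - data[:, None]) ** 2) / (2 * h ** 2))
    return k.mean(axis=0) / np.sqrt(2 * np.pi * h ** 2)

grid = np.linspace(-4, 4, 9)
print(f(grid))   # large near the clusters at -2 and +2, small in between
```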

Page 34

General Philosophy of Machine Learning

Page 35

General Philosophy of Machine Learning

• Solution to problem just some function $y(x)$

Page 36

General Philosophy of Machine Learning

• Solution to problem just some function $y(x)$

• Parameterize very flexible functions $f(x)$ (prefer convenient over "correct")

Page 37

General Philosophy of Machine Learning

• Solution to problem just some function $y(x)$

• Parameterize very flexible functions $f(x)$ (prefer convenient over "correct")

• Of all $f$ that come closest to $y$ for training data, prefer the simplest $f$ (a toy illustration follows below)
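A toy illustration of the last point: fit noisy samples with a degree-9 polynomial, with and without a ridge penalty on the coefficients. Both fits pass near the training data, but the penalized ("simpler") one has far smaller coefficients (data, degree, and penalty strength are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=20)

X = np.vander(x, 10)                              # columns x^9, ..., x^0
for lam in (0.0, 1e-3):
    # Ridge fit: minimize |Xw - y|^2 + lam |w|^2 via an augmented system.
    A = np.vstack([X, np.sqrt(lam) * np.eye(10)])
    t = np.concatenate([y, np.zeros(10)])
    w = np.linalg.lstsq(A, t, rcond=None)[0]
    print(f"lam={lam}: max |coefficient| = {np.abs(w).max():.3g}")
```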

Page 38

Tensors in Machine Learning

Where can tensors appear in machine learning applications?

Page 39

Multi-Dimensional Data

r,g,b value

x coord

y coord

[The slide reuses example images from Figure 4 of Krizhevsky et al. as an illustration of image data.]

Image Data

Page 40:

Multi-Dimensional Data

r,g,b value

x coord

y coord


Image Data

Medical Data

height weight symptom age

Page 41:

Neural Network Weight Layers


Possible to interpret as a very high-order tensor (not just a matrix)
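As a concrete illustration (a sketch added here, with a made-up layer size and index grouping), in NumPy:

import numpy as np

W = np.random.rand(1024, 1024)   # an ordinary dense weight matrix
T = W.reshape([2] * 20)          # the same 2^20 numbers viewed as an order-20 tensor
print(T.ndim)                    # 20 indices: a natural target for MPS/TT compression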

Page 42:

Weights of Discriminative, Generative Models

For certain cases of kernel learning and Gaussian processes, weights are naturally a high-order tensor

f(x) = W · Φ(x)

(diagram: weight tensor W contracted with local feature maps Φ applied to the inputs x1 x2 x3 x4)

Page 43:

Why Tensor Networks?

Page 44:

Tensor network = factorization of huge tensor into contracted product of smaller tensors


Page 45:

Benefits (see the sketch below):
• exponential reduction in memory needed
• exponential speedup of computations (addition, product)
• theoretical insight and interpretation
• estimation of missing or corrupted entries
• many optimization algorithms & strategies
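A minimal NumPy sketch of this factorization (an illustration added here, not code from the talk): split off one index at a time with a truncated SVD. For a generic tensor the truncation is approximate, which is the price of the compression.

import numpy as np

def tensor_to_mps(T, chi_max):
    # split off one index at a time with an SVD, keeping at most chi_max
    # singular values per bond (lossy for a generic tensor)
    d, N = T.shape[0], T.ndim
    mps, R = [], T.reshape(1, -1)
    for _ in range(N - 1):
        R = R.reshape(R.shape[0] * d, -1)
        U, S, Vh = np.linalg.svd(R, full_matrices=False)
        chi = min(chi_max, len(S))
        mps.append(U[:, :chi].reshape(-1, d, chi))
        R = S[:chi, None] * Vh[:chi, :]
    mps.append(R.reshape(-1, d, 1))
    return mps

T = np.random.rand(*([2] * 10))              # 2^10 = 1024 entries
mps = tensor_to_mps(T, chi_max=8)
print(T.size, "entries vs", sum(A.size for A in mps), "MPS parameters")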

Page 46:

N-index tensor represented as shape with N lines

s1 s2 s3 s4 ··· sN

Notation – Tensor Diagrams

T^{s_1 s_2 s_3 ··· s_N} =

Page 47:

Diagrams for low-order tensors

v_j  (vector: one line, labeled j)

M_ij  (matrix: two lines, labeled i and j)

T_ijk  (order-3 tensor: three lines, labeled i, j, k)

Page 50:

Joining lines implies contraction, can omit names

Σ_j M_ij v_j

A_ij B_jk = AB

A_ij B_ji = Tr[AB]
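The same three diagrams in NumPy's einsum notation (a sketch added for concreteness):

import numpy as np

M, v = np.random.rand(3, 3), np.random.rand(3)
A, B = np.random.rand(3, 4), np.random.rand(4, 3)

Mv = np.einsum('ij,j->i', M, v)    # one joined line: sum_j M_ij v_j
AB = np.einsum('ij,jk->ik', A, B)  # matrix product A_ij B_jk
tr = np.einsum('ij,ji->', A, B)    # both lines joined: Tr[AB]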

Page 55:

Best understood tensor network in physics is the matrix product state (MPS) [1,2]

Known as tensor train in the applied math literature [3]

[1] Östlund, Rommer, PRL 75, 3537 (1995)

[2] Vidal, PRL 91, 147902 (2003)

[3] Oseledets, SIAM J. Sci. Comp. 33, 2295 (2011)

Page 56:

Name matrix product state refers to retrieving elements: fixing the external index values (here 0 1 1 0 1) selects one matrix from each tensor of the network, and multiplying these matrices together yields the element:

0 1 1 0 1

→ T^{01101}
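A minimal sketch of this retrieval for a random 5-site MPS (an illustration added here; each tensor is stored as (left bond, physical index, right bond)):

import numpy as np
np.random.seed(0)

d, chi = 2, 3
mps = [np.random.rand(1, d, chi),
       np.random.rand(chi, d, chi),
       np.random.rand(chi, d, chi),
       np.random.rand(chi, d, chi),
       np.random.rand(chi, d, 1)]

def mps_element(mps, indices):
    v = np.ones((1, 1))
    for A, s in zip(mps, indices):
        v = v @ A[:, s, :]    # multiply in the matrix selected by index value s
    return v[0, 0]

print(mps_element(mps, [0, 1, 1, 0, 1]))   # the element T^{01101}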

Page 64:

Adjustable parameter of matrix product state (MPS) is the bond dimension χ

If modest χ yields good approximation, obtain massive compression:

d^N parameters → N d χ² parameters

(e.g. N = 20, d = 2, χ = 10: 2^20 ≈ 10^6 entries versus 20 · 2 · 10² = 4000 parameters)

Page 65:

Can efficiently sum MPS in compressed form:

Or multiply by other networks:
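A sketch of MPS addition via the standard direct-sum construction (code added here, same (left, physical, right) storage as the sketch above); the bond dimensions of the two inputs add, so a truncation step usually follows:

import numpy as np

def mps_add(A, B):
    # block-diagonal in the bond indices; boundary bonds stay dimension 1
    N, C = len(A), []
    for i, (a, b) in enumerate(zip(A, B)):
        la, d, ra = a.shape
        lb, _, rb = b.shape
        if i == 0:                      # first site: concatenate along right bond
            c = np.concatenate([a, b], axis=2)
        elif i == N - 1:                # last site: concatenate along left bond
            c = np.concatenate([a, b], axis=0)
        else:                           # middle sites: block-diagonal
            c = np.zeros((la + lb, d, ra + rb))
            c[:la, :, :ra] = a
            c[la:, :, ra:] = b
        C.append(c)
    return C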

Page 66:

MPS can be perfectly sampled (no autocorrelation):

0 0 0 0 0 0
0 1 0 0 1 0
0 1 1 0 0 0
0 0 1 0 0 1
1 1 0 1 0 1
0 0 0 0 0 0
0 1 1 0 0 0
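A sketch of the standard sequential sampling algorithm for a real-valued MPS (an illustration added here): each sample is drawn exactly from p(s) ∝ |T^{s1...sN}|², with no Markov chain and hence no autocorrelation.

import numpy as np
np.random.seed(1)

d, chi, N = 2, 3, 6
mps = [np.random.rand(1, d, chi)] + \
      [np.random.rand(chi, d, chi) for _ in range(N - 2)] + \
      [np.random.rand(chi, d, 1)]

# right environments: E[i] contracts sites i..N-1 of the ket with the bra
E = [None] * (N + 1)
E[N] = np.ones((1, 1))
for i in range(N - 1, -1, -1):
    E[i] = np.einsum('lsr,msq,rq->lm', mps[i], mps[i], E[i + 1])

def sample(mps, E):
    L, s_out = np.ones((1, 1)), []
    for i in range(len(mps)):
        # unnormalized marginal weight of each value s at site i
        w = np.einsum('lm,lsr,msq,rq->s', L, mps[i], mps[i], E[i + 1])
        s = np.random.choice(len(w), p=w / w.sum())
        A = mps[i][:, s, :]
        L = np.einsum('lm,lr,mq->rq', L, A, A)   # absorb the sampled site
        s_out.append(int(s))
    return s_out

print(sample(mps, E))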

Page 67:

In quantum physics context, have rich theory of which tensor networks are suited for particular "data"

(Here "data" = samples/measurements of a quantum wavefunction)

1D system · 1D critical system · 2D system

Page 68:

Last but not least, can pull off impressive calculations

capture wavefunctions of 1000's of atoms

FIG. 2. Electronic density in the y-z plane of a linear chain of 10 hydrogen atoms, equally spaced at a near-neighbor distance R = 2.4 a.u., calculated in a sliced cc-pVDZ basis (with No = 4). A dimerization pattern is visible, induced by the open ends of the chain, representing the strong tendency to dimerize into H2 molecules.

study challenging, correlated electron models in 2D

Figure 4: Charge and spin orders. Shown are sketches of the charge and spin orders in the wavelength 8 stripes from (A) DMET, (B) AFQMC, (C) iPEPS and (D) DMRG. The local magnetic moments and hole densities are shown above and below the order plots, respectively. (Circle radius is proportional to hole density, arrow height is proportional to spin density.) The gray dashed lines represent the positions of maximum hole density and the magnetic domain wall. For more details, see (30).

FIG. 1. (a) iPEPS ansatz with a 3 × 2 unit cell of tensors which is periodically repeated on the lattice. (b) The reduced tensor a[x,y] is obtained by contracting an iPEPS tensor A[x,y] with its conjugate along the physical leg. (c) The norm ⟨Ψ|Ψ⟩ is an infinite square-lattice network of reduced tensors; the corner tensors C1-C4 and edge tensors T1-T4 of the CTM environment represent the infinite system surrounding a bulk site.

Corboz, PRB 94, 035133 (2016)
Zheng et al., Science 358, 1155 (2017)

Stoudenmire, White, PRL 119, 046401 (2017)

Page 69:

Are tensor networks only useful for wavefunctions? Why not other tensors too?

Actually, since the late 2000's, the applied math community has been exploring tensor networks (Hackbusch, Oseledets, ...)

Page 70:

Tensor networks are a general tool for linear algebra in exponentially high-dimensional spaces

Notions like entanglement entropy correspond to multilinear tensor rank
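For instance (a sketch added here), the entanglement entropy across a cut is computed from the singular values of the tensor reshaped into a matrix, i.e. from ordinary linear-algebra rank data:

import numpy as np

T = np.random.rand(2, 2, 2, 2, 2, 2)
T /= np.linalg.norm(T)
M = T.reshape(2**3, 2**3)        # bipartition: indices (s1 s2 s3) vs (s4 s5 s6)
s = np.linalg.svd(M, compute_uv=False)
p = s**2                          # Schmidt weights, sum to 1 after normalization
p = p[p > 1e-12]
S = -np.sum(p * np.log(p))        # entanglement entropy of the cut
rank = int(np.sum(s > 1e-12))     # bond dimension an exact MPS needs at this cut
print(S, rank)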

Page 71:

MPS and other tensor networks already being explored for machine learning:

...and many others

(diagram: MPS over inputs x1 x2 x3 x4 x5 x6)

p(x) = |Ψ(x)|²

unsupervised learning

Bengua, Phien, Tuan, 10.1109/BigDataCongress.2015.105 (2015)

Cichocki et al, dx.doi.org/10.1561/2200000067 (2017)

Han, et al, Phys. Rev. X 8, 031012 (2018)

Stokes, Terilla, arxiv:1902.06888

Novikov et al., Advances in NIPS (2015), arxiv:1509.06569

Garipov, Podoprikhin, Novikov, arxiv:1611.03214

compressing neural nets

Yu, Zheng, Anandkumar, Yue, arxiv:1711.00073

Miller, et al., arxiv:2003.01039

supervised learning

Novikov, Trofimov, Oseledets, arxiv:1605.03795
Stoudenmire, Schwab, Advances in N.I.P.S., 29, 4799

I. Glasser, N. Pancotti, J.I. Cirac, arxiv:1806.05964

Efthymiou, et al., arxiv:1906.06329
Selvan, et al., arxiv:2004.10076

Page 72:

Compressing Neural Network Weight Layers

Novikov et al., Advances in Neural Information Processing (2015) (arxiv:1509.06569)

Garipov, Podoprikhin, Novikov, arxiv:1611.03214

• Train very "wide" model: 262,144 hidden units

• Achieve 80x compression, only 1% accuracy loss (see the sketch after the references below)

Hallam, Andrew, et al. "Compact neural networks based on the multiscale entanglement renormalization ansatz." arXiv:1711.03357

Rose Yu, Stephan Zheng, Anima Anandkumar, Yisong Yue, "Long-term Forecasting using Tensor-Train RNNs", arXiv:1711.00073
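The compression step in these papers rests on the tensor-train (TT / MPS) factorization: reshape the dense weight matrix into a high-order tensor and split it into a chain of low-rank cores by successive truncated SVDs. A rough NumPy sketch of that step; the shapes and the rank cap are illustrative choices, not the settings from the papers above:

```python
# Sketch of "tensorizing" a weight layer: view a dense matrix as a high-order
# tensor, then factor it into a tensor train by successive truncated SVDs.
import numpy as np

def tt_decompose(tensor, max_rank):
    """Return TT-cores of shape (r_left, dim, r_right) for an order-N tensor."""
    dims = tensor.shape
    cores, r = [], 1
    mat = tensor.reshape(r * dims[0], -1)
    for k, d in enumerate(dims[:-1]):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        r_new = min(max_rank, len(S))
        cores.append(U[:, :r_new].reshape(r, d, r_new))       # next TT-core
        mat = (S[:r_new, None] * Vt[:r_new]).reshape(r_new * dims[k + 1], -1)
        r = r_new
    cores.append(mat.reshape(r, dims[-1], 1))                  # last core
    return cores

# A 256 x 256 weight matrix viewed as an order-8 tensor with 4-dim legs:
W = np.random.default_rng(0).standard_normal((256, 256))
cores = tt_decompose(W.reshape([4] * 8), max_rank=20)
print(W.size, "dense parameters vs", sum(c.size for c in cores), "TT parameters")
```

The parameter count of the TT-cores grows linearly in the number of legs (at fixed rank), which is where the reported 80x savings come from.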

Page 73

Framework where tensor network plays central role?

[diagram: weight tensor W]

Motivation:

• Can natural images be more complex than wavefunctions?

• Import many ideas, algorithms from physics

• Improve tensor network methods

Page 74

MPS and Tensor Networks for General Data

Page 75

Ingredients for machine learning:

Data (+ labels): $\{x_j, y_j\}$

Objective / cost function: $\min_f \frac{1}{|T|} \sum_{j=1}^{|T|} \big( f(x_j) - y_j \big)^2$

Class of model functions: $f(x)$
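To make the three ingredients concrete, here is a toy version in NumPy; the one-parameter linear model class and the synthetic data are illustrative, not part of the talk:

```python
# The three ingredients on a toy 1-D regression task: data {x_j, y_j},
# a quadratic cost averaged over the training set T, and a model class f(x) = w*x.
import numpy as np

rng = np.random.default_rng(0)
xs = rng.uniform(-1.0, 1.0, size=100)              # data {x_j}
ys = 2.0 * xs + 0.1 * rng.standard_normal(100)     # labels {y_j}

def f(x, w):                                       # model class, indexed by w
    return w * x

def cost(w):                                       # (1/|T|) sum_j (f(x_j) - y_j)^2
    return np.mean((f(xs, w) - ys) ** 2)

w = 0.0                                            # minimize over the class
for _ in range(200):
    grad = np.mean(2.0 * (f(xs, w) - ys) * xs)     # d(cost)/dw
    w -= 0.1 * grad
print(w, cost(w))
```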

Page 76

How to get a class of functions where a huge (order-N) tensor appears?

Consider a polynomial over N variables:

$f(x_1, x_2, \ldots, x_N) = \sum_{n} W_{n_1 n_2 \cdots n_N}\, x_1^{n_1} x_2^{n_2} \cdots x_N^{n_N}$


Novikov, Trofimov, Oseledets, arxiv:1605.03795
Stoudenmire, Schwab, arxiv:1605.05775

Page 77

How to get a class of functions where a huge (order-N) tensor appears?

Consider a polynomial over N variables:

$f(x_1, x_2, \ldots, x_N) = \sum_{n} W_{n_1 n_2 \cdots n_N}\, x_1^{n_1} x_2^{n_2} \cdots x_N^{n_N}$

$= W_{00000} + W_{10000}\, x_1 + W_{01000}\, x_2 + \ldots$

$+ W_{11000}\, x_1 x_2 + W_{10100}\, x_1 x_3 + \ldots$

$+ W_{11111}\, x_1 x_2 x_3 x_4 x_5$

(written out for $N = 5$: each index $n_j \in \{0, 1\}$, so $W$ has $2^N$ entries)
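Since each index $n_j$ only takes the values 0 and 1, evaluating $f(x)$ is exactly a tensor contraction: each leg of $W$ is contracted with the vector $(x_j^0, x_j^1) = (1, x_j)$. A small check for $N = 5$, with a random $W$ as a stand-in:

```python
# Evaluate f(x) for N = 5, n_j in {0,1}: contract each leg of the order-5
# tensor W with the vector (1, x_j).
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((2,) * 5)
x = np.array([0.3, -1.2, 0.7, 0.0, 2.5])

legs = [np.array([1.0, xj]) for xj in x]
f_x = np.einsum('abcde,a,b,c,d,e->', W, *legs)

# Same value from the explicit 2^5-term sum over all index combinations:
check = sum(W[n] * np.prod(x ** np.array(n)) for n in np.ndindex(*W.shape))
assert np.isclose(f_x, check)
print(f_x)
```

Storing the full $W$ already costs $2^N$ numbers, which is exactly where a tensor-network factorization of $W$ earns its keep.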


Novikov, Trofimov, Oseledets, arxiv:1605.03795
Stoudenmire, Schwab, arxiv:1605.05775

Page 78

More generally, use any basis of products of functions

$f(x_1, x_2, \ldots, x_N) = \sum_{n} W_{n_1 n_2 \cdots n_N}\, \phi_{n_1}(x_1)\, \phi_{n_2}(x_2) \cdots \phi_{n_N}(x_N)$

Novikov, Trofimov, Oseledets, arxiv:1605.03795
Stoudenmire, Schwab, arxiv:1605.05775

Page 79

More generally, use any basis of products of functions

$f(x_1, x_2, \ldots, x_N) = \sum_{n} W_{n_1 n_2 \cdots n_N}\, \phi_{n_1}(x_1)\, \phi_{n_2}(x_2) \cdots \phi_{n_N}(x_N)$

$\phi(x) = \begin{bmatrix} 1 \\ x \end{bmatrix}$ gives polynomials
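The same contraction works for any local feature map $\phi$: with $\phi(x) = (1, x)$ it reproduces the polynomial above. A sketch with two choices of $\phi$; the trigonometric map is one common alternative in this literature, and the random $W$ is again a stand-in:

```python
# f(x) with a general local feature map phi: contract each leg of W with phi(x_j).
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((2,) * 5)
x = np.array([0.3, 0.9, 0.1, 0.5, 0.8])

def f(x, phi):
    return np.einsum('abcde,a,b,c,d,e->', W, *[phi(xj) for xj in x])

poly = lambda t: np.array([1.0, t])                                  # polynomials
trig = lambda t: np.array([np.cos(np.pi * t / 2), np.sin(np.pi * t / 2)])
print(f(x, poly), f(x, trig))
```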

Novikov, Trofimov, Oseledets, arxiv:1605.03795
Stoudenmire, Schwab, arxiv:1605.05775

Page 80

More generally, use any basis of products of functions

$f(x_1, x_2, \ldots, x_N) = \sum_{n} W_{n_1 n_2 \cdots n_N}\, \phi_{n_1}(x_1)\, \phi_{n_2}(x_2) \cdots \phi_{n_N}(x_N)$

$\phi(x) = \begin{bmatrix} 1 \\ x \end{bmatrix}$ gives polynomials

[<latexit sha1_base64="yFkxOj6WHNRIPz4y+vEmXq9E/kk=">AAACt3icdVHLTsMwEHTDu7zhyMUiQuKEkgoJuCFx4QgSBaQkqjbuprVqO8F2iqrQL+AKd36Lv8ENPQAtK1kazc541t60ENzYIPhseAuLS8srq2vN9Y3Nre2d3b17k5eaYZvlItePKRgUXGHbcivwsdAIMhX4kA6uJv2HIWrDc3VnRwUmEnqKZ5yBddRt1Nnxg5OgLjoLwinwybRuOruNj7ibs1KiskyAMVEYFDapQFvOBI6bcWmwADaAHkYOKpBokqqedEyPHNOlWa7dUZbW7E9HBdKYkUydUoLtm7+9CTmvF5U2O08qrorSomLfQVkpqM3p5Nm0yzUyK0YOANPczUpZHzQw6z6nGSt8ZrmUoLpVPEA7jsKkilGZUuMkq3rxw1iD6rkHjn+rUw0z6ljUUj98maOur2/NNVDn8Fv0nyQY9v5LcsYfLrfT8O8GZ8F96yR0+PbUv7yabneVHJBDckxCckYuyTW5IW3CCJJX8kbevQuv42Ve/1vqNaaeffKrvKcvatvbxg==</latexit> <latexit sha1_base64="yFkxOj6WHNRIPz4y+vEmXq9E/kk=">AAACt3icdVHLTsMwEHTDu7zhyMUiQuKEkgoJuCFx4QgSBaQkqjbuprVqO8F2iqrQL+AKd36Lv8ENPQAtK1kazc541t60ENzYIPhseAuLS8srq2vN9Y3Nre2d3b17k5eaYZvlItePKRgUXGHbcivwsdAIMhX4kA6uJv2HIWrDc3VnRwUmEnqKZ5yBddRt1Nnxg5OgLjoLwinwybRuOruNj7ibs1KiskyAMVEYFDapQFvOBI6bcWmwADaAHkYOKpBokqqedEyPHNOlWa7dUZbW7E9HBdKYkUydUoLtm7+9CTmvF5U2O08qrorSomLfQVkpqM3p5Nm0yzUyK0YOANPczUpZHzQw6z6nGSt8ZrmUoLpVPEA7jsKkilGZUuMkq3rxw1iD6rkHjn+rUw0z6ljUUj98maOur2/NNVDn8Fv0nyQY9v5LcsYfLrfT8O8GZ8F96yR0+PbUv7yabneVHJBDckxCckYuyTW5IW3CCJJX8kbevQuv42Ve/1vqNaaeffKrvKcvatvbxg==</latexit> <latexit sha1_base64="yFkxOj6WHNRIPz4y+vEmXq9E/kk=">AAACt3icdVHLTsMwEHTDu7zhyMUiQuKEkgoJuCFx4QgSBaQkqjbuprVqO8F2iqrQL+AKd36Lv8ENPQAtK1kazc541t60ENzYIPhseAuLS8srq2vN9Y3Nre2d3b17k5eaYZvlItePKRgUXGHbcivwsdAIMhX4kA6uJv2HIWrDc3VnRwUmEnqKZ5yBddRt1Nnxg5OgLjoLwinwybRuOruNj7ibs1KiskyAMVEYFDapQFvOBI6bcWmwADaAHkYOKpBokqqedEyPHNOlWa7dUZbW7E9HBdKYkUydUoLtm7+9CTmvF5U2O08qrorSomLfQVkpqM3p5Nm0yzUyK0YOANPczUpZHzQw6z6nGSt8ZrmUoLpVPEA7jsKkilGZUuMkq3rxw1iD6rkHjn+rUw0z6ljUUj98maOur2/NNVDn8Fv0nyQY9v5LcsYfLrfT8O8GZ8F96yR0+PbUv7yabneVHJBDckxCckYuyTW5IW3CCJJX8kbevQuv42Ve/1vqNaaeffKrvKcvatvbxg==</latexit> <latexit sha1_base64="yFkxOj6WHNRIPz4y+vEmXq9E/kk=">AAACt3icdVHLTsMwEHTDu7zhyMUiQuKEkgoJuCFx4QgSBaQkqjbuprVqO8F2iqrQL+AKd36Lv8ENPQAtK1kazc541t60ENzYIPhseAuLS8srq2vN9Y3Nre2d3b17k5eaYZvlItePKRgUXGHbcivwsdAIMhX4kA6uJv2HIWrDc3VnRwUmEnqKZ5yBddRt1Nnxg5OgLjoLwinwybRuOruNj7ibs1KiskyAMVEYFDapQFvOBI6bcWmwADaAHkYOKpBokqqedEyPHNOlWa7dUZbW7E9HBdKYkUydUoLtm7+9CTmvF5U2O08qrorSomLfQVkpqM3p5Nm0yzUyK0YOANPczUpZHzQw6z6nGSt8ZrmUoLpVPEA7jsKkilGZUuMkq3rxw1iD6rkHjn+rUw0z6ljUUj98maOur2/NNVDn8Fv0nyQY9v5LcsYfLrfT8O8GZ8F96yR0+PbUv7yabneVHJBDckxCckYuyTW5IW3CCJJX8kbevQuv42Ve/1vqNaaeffKrvKcvatvbxg==</latexit>

gives symmetry x $ (1� x)<latexit sha1_base64="jyCALn3cBDUnctfwF1fm4q3o4vY=">AAACz3icdVHLahsxFJWnr9R9xGkXXXQjOhTSRcPIFNpFF4FuukygTgKewdyR74yF9RgkTRwzmdJt/6LQbftB/ZvKEy+S2LkgOJx7jo6ubl5J4XyS/OtF9+4/ePho53H/ydNnz3cHey9OnKktxxE30tizHBxKoXHkhZd4VlkElUs8zedfVv3Tc7ROGP3NLyvMFJRaFIKDD9Rk8OqCphILb0U582CtWdB99v7i3WQQJwdJV3QTsDWIybqOJnu9X+nU8Fqh9lyCc2OWVD5rwHrBJbb9tHZYAZ9DieMANSh0WdNN0NK3gZnSwthwtKcde93RgHJuqfKgVOBn7nZvRW7rjWtffMoaoavao+ZXQUUtqTd09R10KixyL5cBALcivJXyGVjgPnxaP9W44EYp0NMmnaNvxyxrUtSutrjKai5jllrQZRiwvanOLWyoU9lJY3a5Rd1dP9xqoMERD+kdSXBe3pUUjNdcYafs9gY3wcnwgAV8/CE+/Lze7g55Td6QfcLIR3JIvpIjMiKctOQ3+UP+RsfRIvoe/biSRr215yW5UdHP/81P5KA=</latexit><latexit sha1_base64="jyCALn3cBDUnctfwF1fm4q3o4vY=">AAACz3icdVHLahsxFJWnr9R9xGkXXXQjOhTSRcPIFNpFF4FuukygTgKewdyR74yF9RgkTRwzmdJt/6LQbftB/ZvKEy+S2LkgOJx7jo6ubl5J4XyS/OtF9+4/ePho53H/ydNnz3cHey9OnKktxxE30tizHBxKoXHkhZd4VlkElUs8zedfVv3Tc7ROGP3NLyvMFJRaFIKDD9Rk8OqCphILb0U582CtWdB99v7i3WQQJwdJV3QTsDWIybqOJnu9X+nU8Fqh9lyCc2OWVD5rwHrBJbb9tHZYAZ9DieMANSh0WdNN0NK3gZnSwthwtKcde93RgHJuqfKgVOBn7nZvRW7rjWtffMoaoavao+ZXQUUtqTd09R10KixyL5cBALcivJXyGVjgPnxaP9W44EYp0NMmnaNvxyxrUtSutrjKai5jllrQZRiwvanOLWyoU9lJY3a5Rd1dP9xqoMERD+kdSXBe3pUUjNdcYafs9gY3wcnwgAV8/CE+/Lze7g55Td6QfcLIR3JIvpIjMiKctOQ3+UP+RsfRIvoe/biSRr215yW5UdHP/81P5KA=</latexit><latexit sha1_base64="jyCALn3cBDUnctfwF1fm4q3o4vY=">AAACz3icdVHLahsxFJWnr9R9xGkXXXQjOhTSRcPIFNpFF4FuukygTgKewdyR74yF9RgkTRwzmdJt/6LQbftB/ZvKEy+S2LkgOJx7jo6ubl5J4XyS/OtF9+4/ePho53H/ydNnz3cHey9OnKktxxE30tizHBxKoXHkhZd4VlkElUs8zedfVv3Tc7ROGP3NLyvMFJRaFIKDD9Rk8OqCphILb0U582CtWdB99v7i3WQQJwdJV3QTsDWIybqOJnu9X+nU8Fqh9lyCc2OWVD5rwHrBJbb9tHZYAZ9DieMANSh0WdNN0NK3gZnSwthwtKcde93RgHJuqfKgVOBn7nZvRW7rjWtffMoaoavao+ZXQUUtqTd09R10KixyL5cBALcivJXyGVjgPnxaP9W44EYp0NMmnaNvxyxrUtSutrjKai5jllrQZRiwvanOLWyoU9lJY3a5Rd1dP9xqoMERD+kdSXBe3pUUjNdcYafs9gY3wcnwgAV8/CE+/Lze7g55Td6QfcLIR3JIvpIjMiKctOQ3+UP+RsfRIvoe/biSRr215yW5UdHP/81P5KA=</latexit><latexit sha1_base64="jyCALn3cBDUnctfwF1fm4q3o4vY=">AAACz3icdVHLahsxFJWnr9R9xGkXXXQjOhTSRcPIFNpFF4FuukygTgKewdyR74yF9RgkTRwzmdJt/6LQbftB/ZvKEy+S2LkgOJx7jo6ubl5J4XyS/OtF9+4/ePho53H/ydNnz3cHey9OnKktxxE30tizHBxKoXHkhZd4VlkElUs8zedfVv3Tc7ROGP3NLyvMFJRaFIKDD9Rk8OqCphILb0U582CtWdB99v7i3WQQJwdJV3QTsDWIybqOJnu9X+nU8Fqh9lyCc2OWVD5rwHrBJbb9tHZYAZ9DieMANSh0WdNN0NK3gZnSwthwtKcde93RgHJuqfKgVOBn7nZvRW7rjWtffMoaoavao+ZXQUUtqTd09R10KixyL5cBALcivJXyGVjgPnxaP9W44EYp0NMmnaNvxyxrUtSutrjKai5jllrQZRiwvanOLWyoU9lJY3a5Rd1dP9xqoMERD+kdSXBe3pUUjNdcYafs9gY3wcnwgAV8/CE+/Lze7g55Td6QfcLIR3JIvpIjMiKctOQ3+UP+RsfRIvoe/biSRr215yW5UdHP/81P5KA=</latexit>

cos�⇡x/2

�<latexit sha1_base64="qMIkNTEaZeUFxvhwUFg+Ql5B7ns=">AAACznicdVFNbxMxEHWWrxI+moLEhYvFCqlcym6EKMdKXDgGibSV4lU060y2Vvyxsr2h0WbFlZ+BuMIf4t/g3ebQNulIlp5n3vPzzOSlFM4nyb9edO/+g4eP9h73nzx99nx/cPDi1JnKchxzI409z8GhFBrHXniJ56VFULnEs3zxua2fLdE6YfQ3vyoxU1BoMRccfEhNB68YN47lojhkpaCX9P2wvbybDuLkKOmCboN0A2KyidH0oPeLzQyvFGrPJTg3SZPSZzVYL7jEps8qhyXwBRQ4CVCDQpfVXQMNfRsyMzo3NhztaZe9rqhBObdSeWAq8Bfudq1N7qpNKj//lNVCl5VHza+M5pWk3tB2GnQmLHIvVwEAtyL8lfILsMB9mFmfafzOjVKgZzVboG8maVYz1K6y2HrV6zhlFnQRGmxusnMLW2wmO2qcrnewu+eHOwU0KOIhvcMJlsVdTkF4TRV2mt7e4DY4HR6lAX/9EJ983Gx3j7wmb8ghSckxOSFfyIiMCSdr8pv8IX+jUbSMmujHFTXqbTQvyY2Ifv4HrOjjvQ==</latexit><latexit sha1_base64="qMIkNTEaZeUFxvhwUFg+Ql5B7ns=">AAACznicdVFNbxMxEHWWrxI+moLEhYvFCqlcym6EKMdKXDgGibSV4lU060y2Vvyxsr2h0WbFlZ+BuMIf4t/g3ebQNulIlp5n3vPzzOSlFM4nyb9edO/+g4eP9h73nzx99nx/cPDi1JnKchxzI409z8GhFBrHXniJ56VFULnEs3zxua2fLdE6YfQ3vyoxU1BoMRccfEhNB68YN47lojhkpaCX9P2wvbybDuLkKOmCboN0A2KyidH0oPeLzQyvFGrPJTg3SZPSZzVYL7jEps8qhyXwBRQ4CVCDQpfVXQMNfRsyMzo3NhztaZe9rqhBObdSeWAq8Bfudq1N7qpNKj//lNVCl5VHza+M5pWk3tB2GnQmLHIvVwEAtyL8lfILsMB9mFmfafzOjVKgZzVboG8maVYz1K6y2HrV6zhlFnQRGmxusnMLW2wmO2qcrnewu+eHOwU0KOIhvcMJlsVdTkF4TRV2mt7e4DY4HR6lAX/9EJ983Gx3j7wmb8ghSckxOSFfyIiMCSdr8pv8IX+jUbSMmujHFTXqbTQvyY2Ifv4HrOjjvQ==</latexit><latexit sha1_base64="qMIkNTEaZeUFxvhwUFg+Ql5B7ns=">AAACznicdVFNbxMxEHWWrxI+moLEhYvFCqlcym6EKMdKXDgGibSV4lU060y2Vvyxsr2h0WbFlZ+BuMIf4t/g3ebQNulIlp5n3vPzzOSlFM4nyb9edO/+g4eP9h73nzx99nx/cPDi1JnKchxzI409z8GhFBrHXniJ56VFULnEs3zxua2fLdE6YfQ3vyoxU1BoMRccfEhNB68YN47lojhkpaCX9P2wvbybDuLkKOmCboN0A2KyidH0oPeLzQyvFGrPJTg3SZPSZzVYL7jEps8qhyXwBRQ4CVCDQpfVXQMNfRsyMzo3NhztaZe9rqhBObdSeWAq8Bfudq1N7qpNKj//lNVCl5VHza+M5pWk3tB2GnQmLHIvVwEAtyL8lfILsMB9mFmfafzOjVKgZzVboG8maVYz1K6y2HrV6zhlFnQRGmxusnMLW2wmO2qcrnewu+eHOwU0KOIhvcMJlsVdTkF4TRV2mt7e4DY4HR6lAX/9EJ983Gx3j7wmb8ghSckxOSFfyIiMCSdr8pv8IX+jUbSMmujHFTXqbTQvyY2Ifv4HrOjjvQ==</latexit><latexit sha1_base64="qMIkNTEaZeUFxvhwUFg+Ql5B7ns=">AAACznicdVFNbxMxEHWWrxI+moLEhYvFCqlcym6EKMdKXDgGibSV4lU060y2Vvyxsr2h0WbFlZ+BuMIf4t/g3ebQNulIlp5n3vPzzOSlFM4nyb9edO/+g4eP9h73nzx99nx/cPDi1JnKchxzI409z8GhFBrHXniJ56VFULnEs3zxua2fLdE6YfQ3vyoxU1BoMRccfEhNB68YN47lojhkpaCX9P2wvbybDuLkKOmCboN0A2KyidH0oPeLzQyvFGrPJTg3SZPSZzVYL7jEps8qhyXwBRQ4CVCDQpfVXQMNfRsyMzo3NhztaZe9rqhBObdSeWAq8Bfudq1N7qpNKj//lNVCl5VHza+M5pWk3tB2GnQmLHIvVwEAtyL8lfILsMB9mFmfafzOjVKgZzVboG8maVYz1K6y2HrV6zhlFnQRGmxusnMLW2wmO2qcrnewu+eHOwU0KOIhvcMJlsVdTkF4TRV2mt7e4DY4HR6lAX/9EJ983Gx3j7wmb8ghSckxOSFfyIiMCSdr8pv8IX+jUbSMmujHFTXqbTQvyY2Ifv4HrOjjvQ==</latexit>

sin�⇡x/2

�<latexit sha1_base64="cyPCi5KLVLUglSlPxhKDfv9iSyc=">AAACznicdVFNaxsxEJW3X4n7EaeFXHoRXQrpJd01pekx0EuOLtRJwFrMrDzeCEvaRdK6Meul1/6MkGv6h/Jvot34kMTOgOBp5j09zUxaSGFdFN10gmfPX7x8tbXdff3m7bud3u77E5uXhuOQ5zI3ZylYlELj0Akn8awwCCqVeJrOfjb10zkaK3L92y0KTBRkWkwFB+dT494es0KzVGT7rBD0gn7tN5cv414YHURt0HUQr0BIVjEY73Yu2STnpULtuARrR3FUuKQC4wSXWHdZabEAPoMMRx5qUGiTqm2gpp99ZkKnufFHO9pm7ysqUNYuVOqZCty5fVxrkptqo9JNfySV0EXpUPM7o2kpqctpMw06EQa5kwsPgBvh/0r5ORjgzs+syzT+4blSoCcVm6GrR3FSMdS2NNh4VcswZgZ05husH7JTA2tsJltqGC83sNvn+xsF1CvCPn3CCebZU05eeE/ldxo/3uA6OOkfxB7/+hYefV9td4t8JJ/IPonJITkix2RAhoSTJbki1+R/MAjmQR38vaMGnZXmA3kQwb9buPvjwg==</latexit><latexit sha1_base64="cyPCi5KLVLUglSlPxhKDfv9iSyc=">AAACznicdVFNaxsxEJW3X4n7EaeFXHoRXQrpJd01pekx0EuOLtRJwFrMrDzeCEvaRdK6Meul1/6MkGv6h/Jvot34kMTOgOBp5j09zUxaSGFdFN10gmfPX7x8tbXdff3m7bud3u77E5uXhuOQ5zI3ZylYlELj0Akn8awwCCqVeJrOfjb10zkaK3L92y0KTBRkWkwFB+dT494es0KzVGT7rBD0gn7tN5cv414YHURt0HUQr0BIVjEY73Yu2STnpULtuARrR3FUuKQC4wSXWHdZabEAPoMMRx5qUGiTqm2gpp99ZkKnufFHO9pm7ysqUNYuVOqZCty5fVxrkptqo9JNfySV0EXpUPM7o2kpqctpMw06EQa5kwsPgBvh/0r5ORjgzs+syzT+4blSoCcVm6GrR3FSMdS2NNh4VcswZgZ05husH7JTA2tsJltqGC83sNvn+xsF1CvCPn3CCebZU05eeE/ldxo/3uA6OOkfxB7/+hYefV9td4t8JJ/IPonJITkix2RAhoSTJbki1+R/MAjmQR38vaMGnZXmA3kQwb9buPvjwg==</latexit><latexit sha1_base64="cyPCi5KLVLUglSlPxhKDfv9iSyc=">AAACznicdVFNaxsxEJW3X4n7EaeFXHoRXQrpJd01pekx0EuOLtRJwFrMrDzeCEvaRdK6Meul1/6MkGv6h/Jvot34kMTOgOBp5j09zUxaSGFdFN10gmfPX7x8tbXdff3m7bud3u77E5uXhuOQ5zI3ZylYlELj0Akn8awwCCqVeJrOfjb10zkaK3L92y0KTBRkWkwFB+dT494es0KzVGT7rBD0gn7tN5cv414YHURt0HUQr0BIVjEY73Yu2STnpULtuARrR3FUuKQC4wSXWHdZabEAPoMMRx5qUGiTqm2gpp99ZkKnufFHO9pm7ysqUNYuVOqZCty5fVxrkptqo9JNfySV0EXpUPM7o2kpqctpMw06EQa5kwsPgBvh/0r5ORjgzs+syzT+4blSoCcVm6GrR3FSMdS2NNh4VcswZgZ05husH7JTA2tsJltqGC83sNvn+xsF1CvCPn3CCebZU05eeE/ldxo/3uA6OOkfxB7/+hYefV9td4t8JJ/IPonJITkix2RAhoSTJbki1+R/MAjmQR38vaMGnZXmA3kQwb9buPvjwg==</latexit><latexit sha1_base64="cyPCi5KLVLUglSlPxhKDfv9iSyc=">AAACznicdVFNaxsxEJW3X4n7EaeFXHoRXQrpJd01pekx0EuOLtRJwFrMrDzeCEvaRdK6Meul1/6MkGv6h/Jvot34kMTOgOBp5j09zUxaSGFdFN10gmfPX7x8tbXdff3m7bud3u77E5uXhuOQ5zI3ZylYlELj0Akn8awwCCqVeJrOfjb10zkaK3L92y0KTBRkWkwFB+dT494es0KzVGT7rBD0gn7tN5cv414YHURt0HUQr0BIVjEY73Yu2STnpULtuARrR3FUuKQC4wSXWHdZabEAPoMMRx5qUGiTqm2gpp99ZkKnufFHO9pm7ysqUNYuVOqZCty5fVxrkptqo9JNfySV0EXpUPM7o2kpqctpMw06EQa5kwsPgBvh/0r5ORjgzs+syzT+4blSoCcVm6GrR3FSMdS2NNh4VcswZgZ05husH7JTA2tsJltqGC83sNvn+xsF1CvCPn3CCebZU05eeE/ldxo/3uA6OOkfxB7/+hYefV9td4t8JJ/IPonJITkix2RAhoSTJbki1+R/MAjmQR38vaMGnZXmA3kQwb9buPvjwg==</latexit>

�(x) =<latexit sha1_base64="VbdH8fx5LxAqnAJvb1FUbnd9sOc=">AAACy3icdVHBahsxEJW3SZu6TeMkx15ElkJ6CbsmkFwKgVx6CaRQJwFrMbPyrC0saRdJ69Zd77Hf0R7bT+rfVLvxIYmdAcHjzXt6I01aSGFdFP3rBC+2tl++2nndffN2991eb//gxual4TjguczNXQoWpdA4cMJJvCsMgkol3qazy6Z/O0djRa6/ukWBiYKJFpng4Dw16h0yBW6aZhUrpqI+/v6RfqKjXhidRG3RdRCvQEhWdT3a7/xm45yXCrXjEqwdxlHhkgqME1xi3WWlxQL4DCY49FCDQptU7fQ1/eCZMc1y4492tGUfOipQ1i5U6pXNrPZpryE39Yaly86TSuiidKj5fVBWSupy2nwFHQuD3MmFB8CN8LNSPgUD3PkP6zKN33iuFOhxxWbo6mGcVAy1LQ02WdUyjJkBPfEPrB+rUwNraiZbaRgvN6jb6/sbDdQ7wj59Jgnmk+eSvPGBy+80frrBdXDTP4k9/nIaXlyutrtD3pMjckxickYuyGdyTQaEkwX5Rf6Qv8FVYIMfwfJeGnRWnkPyqIKf/wGPvuL5</latexit><latexit sha1_base64="VbdH8fx5LxAqnAJvb1FUbnd9sOc=">AAACy3icdVHBahsxEJW3SZu6TeMkx15ElkJ6CbsmkFwKgVx6CaRQJwFrMbPyrC0saRdJ69Zd77Hf0R7bT+rfVLvxIYmdAcHjzXt6I01aSGFdFP3rBC+2tl++2nndffN2991eb//gxual4TjguczNXQoWpdA4cMJJvCsMgkol3qazy6Z/O0djRa6/ukWBiYKJFpng4Dw16h0yBW6aZhUrpqI+/v6RfqKjXhidRG3RdRCvQEhWdT3a7/xm45yXCrXjEqwdxlHhkgqME1xi3WWlxQL4DCY49FCDQptU7fQ1/eCZMc1y4492tGUfOipQ1i5U6pXNrPZpryE39Yaly86TSuiidKj5fVBWSupy2nwFHQuD3MmFB8CN8LNSPgUD3PkP6zKN33iuFOhxxWbo6mGcVAy1LQ02WdUyjJkBPfEPrB+rUwNraiZbaRgvN6jb6/sbDdQ7wj59Jgnmk+eSvPGBy+80frrBdXDTP4k9/nIaXlyutrtD3pMjckxickYuyGdyTQaEkwX5Rf6Qv8FVYIMfwfJeGnRWnkPyqIKf/wGPvuL5</latexit><latexit sha1_base64="VbdH8fx5LxAqnAJvb1FUbnd9sOc=">AAACy3icdVHBahsxEJW3SZu6TeMkx15ElkJ6CbsmkFwKgVx6CaRQJwFrMbPyrC0saRdJ69Zd77Hf0R7bT+rfVLvxIYmdAcHjzXt6I01aSGFdFP3rBC+2tl++2nndffN2991eb//gxual4TjguczNXQoWpdA4cMJJvCsMgkol3qazy6Z/O0djRa6/ukWBiYKJFpng4Dw16h0yBW6aZhUrpqI+/v6RfqKjXhidRG3RdRCvQEhWdT3a7/xm45yXCrXjEqwdxlHhkgqME1xi3WWlxQL4DCY49FCDQptU7fQ1/eCZMc1y4492tGUfOipQ1i5U6pXNrPZpryE39Yaly86TSuiidKj5fVBWSupy2nwFHQuD3MmFB8CN8LNSPgUD3PkP6zKN33iuFOhxxWbo6mGcVAy1LQ02WdUyjJkBPfEPrB+rUwNraiZbaRgvN6jb6/sbDdQ7wj59Jgnmk+eSvPGBy+80frrBdXDTP4k9/nIaXlyutrtD3pMjckxickYuyGdyTQaEkwX5Rf6Qv8FVYIMfwfJeGnRWnkPyqIKf/wGPvuL5</latexit><latexit sha1_base64="VbdH8fx5LxAqnAJvb1FUbnd9sOc=">AAACy3icdVHBahsxEJW3SZu6TeMkx15ElkJ6CbsmkFwKgVx6CaRQJwFrMbPyrC0saRdJ69Zd77Hf0R7bT+rfVLvxIYmdAcHjzXt6I01aSGFdFP3rBC+2tl++2nndffN2991eb//gxual4TjguczNXQoWpdA4cMJJvCsMgkol3qazy6Z/O0djRa6/ukWBiYKJFpng4Dw16h0yBW6aZhUrpqI+/v6RfqKjXhidRG3RdRCvQEhWdT3a7/xm45yXCrXjEqwdxlHhkgqME1xi3WWlxQL4DCY49FCDQptU7fQ1/eCZMc1y4492tGUfOipQ1i5U6pXNrPZpryE39Yaly86TSuiidKj5fVBWSupy2nwFHQuD3MmFB8CN8LNSPgUD3PkP6zKN33iuFOhxxWbo6mGcVAy1LQ02WdUyjJkBPfEPrB+rUwNraiZbaRgvN6jb6/sbDdQ7wj59Jgnmk+eSvPGBy+80frrBdXDTP4k9/nIaXlyutrtD3pMjckxickYuyGdyTQaEkwX5Rf6Qv8FVYIMfwfJeGnRWnkPyqIKf/wGPvuL5</latexit>

Novikov, Trofimov, Oseledets, arxiv:1605.03795Stoudenmire, Schwab, arxiv:1605.05775
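To make the two feature maps concrete, here is a minimal numpy sketch (mine, not from the slides); the names phi_poly, phi_trig, and full_feature_tensor are illustrative only. Note that substituting x → 1 − x in the trigonometric map exactly swaps its two components, which is the claimed symmetry.

import numpy as np

def phi_poly(x):
    # "[1, x]" local map: products of components across sites span
    # the multilinear polynomials in x_1, ..., x_N
    return np.array([1.0, x])

def phi_trig(x):
    # "[cos(pi x/2), sin(pi x/2)]" local map: x -> 1 - x swaps the
    # two components, giving the x <-> (1 - x) symmetry
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

def full_feature_tensor(xs, phi):
    # Rank-N feature tensor Phi(x) = phi(x_1) (x) phi(x_2) (x) ... (x) phi(x_N)
    Phi = phi(xs[0])
    for x in xs[1:]:
        Phi = np.tensordot(Phi, phi(x), axes=0)  # outer product
    return Phi

xs = np.random.rand(6)                 # six inputs in [0, 1]
print(full_feature_tensor(xs, phi_trig).shape)   # (2, 2, 2, 2, 2, 2)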


Main idea: factorize weight tensor

W_{n_1 n_2 n_3 n_4 n_5 n_6} = [diagram: MPS factorization] = [diagram: alternative factorization] or other tensor networks


Can use as a model class for supervised or unsupervised learning

f(x_1, x_2, \ldots, x_6) = \sum_{n_1 \ldots n_6} W_{n_1 n_2 n_3 n_4 n_5 n_6} \, x_1^{n_1} x_2^{n_2} \cdots x_6^{n_6}

trained, e.g., by minimizing the squared loss over the training set T:

\min \; \frac{1}{|T|} \sum_{j=1}^{|T|} \big( f(\mathbf{x}_j) - y_j \big)^2
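As a sketch of why the factorization pays off (plain numpy, with hypothetical helper names random_mps and evaluate): with W stored as an MPS, f(x) is evaluated by contracting one core per site, at cost O(N d χ²) instead of touching all d^N entries of W.

import numpy as np

def random_mps(n_sites, phys_dim=2, bond_dim=4):
    # List of rank-3 cores A[k] with shape (left bond, n_k, right bond);
    # boundary bonds have dimension 1.
    dims = [1] + [bond_dim] * (n_sites - 1) + [1]
    return [np.random.randn(dims[k], phys_dim, dims[k + 1]) * 0.5
            for k in range(n_sites)]

def evaluate(mps, feature_vectors):
    # f(x) = sum_n W_{n_1...n_N} phi^{n_1}(x_1) ... phi^{n_N}(x_N),
    # contracted site by site, never forming the full W.
    v = np.ones((1,))
    for A, phi in zip(mps, feature_vectors):
        v = np.einsum('l,lnr,n->r', v, A, phi)
    return v[0]

mps = random_mps(6)
x = np.random.rand(6)
phis = [np.array([1.0, xi]) for xi in x]   # the [1, x] feature map
print(evaluate(mps, phis))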


Immediate payoff: adaptivity of weights

1) merge (contract) a pair of tensors

2) optimize parameters (e.g. gradient steps)

3) SVD factorization to adapt size of bond index (one full update step is sketched below)
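Here is a minimal numpy sketch of one such adaptive update, in the style of a two-site DMRG sweep (my illustration, not a reference implementation); grad_fn, lr, cutoff, and max_bond are assumed placeholder names.

import numpy as np

def two_site_update(A1, A2, grad_fn, lr=0.1, cutoff=1e-8, max_bond=20):
    # 1) merge (contract) two neighboring MPS tensors over their shared
    #    bond: (l, n1, b) x (b, n2, r) -> (l, n1, n2, r)
    B = np.einsum('lnb,bmr->lnmr', A1, A2)

    # 2) optimize parameters of the merged tensor, e.g. one gradient
    #    step; grad_fn returns dLoss/dB for the chosen loss (hypothetical)
    B = B - lr * grad_fn(B)

    # 3) SVD to refactorize; keeping only singular values above `cutoff`
    #    (up to `max_bond`) adapts the bond dimension to the data
    l, n1, n2, r = B.shape
    U, S, Vh = np.linalg.svd(B.reshape(l * n1, n2 * r), full_matrices=False)
    keep = min(max_bond, int(np.sum(S > cutoff * S[0])))
    A1_new = U[:, :keep].reshape(l, n1, keep)
    A2_new = (np.diag(S[:keep]) @ Vh[:keep]).reshape(keep, n2, r)
    return A1_new, A2_new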

How well do tensor networks perform for supervised & unsupervised learning?

What applications are possible?


Supervised Learning

MPS have been used as a proof of principle for tensor networks in supervised learning

1. MNIST data set (99% test accuracy)
   Stoudenmire, Schwab, NIPS 29, 4799 (2016)

2. Fashion MNIST data set (89% test accuracy)
   Stoudenmire, Quant. Sci. Tech. 3, 034003 (2018)

Glasser, Pancotti, Cirac, arxiv:1806.05964 (2018)

[Excerpt from Glasser, Pancotti, Cirac, arxiv:1806.05964:]

FIG. 12. Using convolutional neural networks as feature-vector extractors from real data: the output of the CNN is seen as an image with a third dimension collecting the different features. For each pixel of this image, the vector of features is contracted with the open legs of a tensor network.

V. NUMERICAL EXPERIMENTS

We test the generalized tensor network approach on the task of image classification, where a natural two-dimensional geometry that can be reflected in the architecture of the tensor network is present, as well as on the task of urban sound recognition, where the time dimension provides a one-dimensional geometry.

A. Image classification

We first consider the MNIST dataset [48], which consists of 28 × 28 greyscale images of digits. There are 10 classes and we adopt a multiclass classification procedure in which one tensor of the tensor network is parametrized by the ten possible labels. The original training set is split into training and validation sets of 55000 and 5000 examples and the performance of the different models is evaluated on the test set of 10000 examples. We consider the following generalized tensor networks: a snake-SBS with 4 strings (Fig. 5b), a 2D-SBS (Fig. 5a), an EPS with a 2 × 2 translational-invariant plaquette combined with a linear classifier (Fig. 7c), an EPS-SBS with translational-invariant plaquette combined with a snake-SBS (Fig. 6) and a CNN-snake-SBS which uses a 1-layer CNN as input features (Fig. 12). The CNN considered here uses a convolutional layer applying six 5 × 5 filters (stride 1) with ReLU activation function and a pooling layer performing max pooling with a 2 × 2 filter. All other networks use the choice of features presented in Eq. (26) and the greyscale values are normalized between 0 and 1. We compare the performance of these networks with a MPS and a RBM (the number of hidden units of 250, 500, 750 or 1000 is taken as a hyperparameter). All networks use a batch size of 20 examples, and hyperparameters such as the learning rate α, the regularization rate and the number of iterations over the training set are determined through a grid search while evaluating the performance on the validation set. Best performance is typically achieved with α = 10⁻⁴, a regularization rate of 0.95 and a hundred iterations.

FIG. 13. Examples of images from the MNIST (a) and fashion MNIST (b) datasets.

FIG. 14. Test set accuracy of different generalized tensor networks on the MNIST dataset.

The test set accuracy, presented in Fig. 14, shows that even with a very small bond dimension generalized tensor networks are able to accurately classify the dataset. Their performance is significantly better than that of a tree tensor network [46] or a MPS trained in frequency space [45], and while a MPS can also achieve 99.03% accuracy with a bond dimension of 120 [15], the cost of optimizing very large tensors has prohibited the use of this method for larger problems so far. The snake-SBS with bond dimension larger than 6 also has better performance than a RBM. Since the snake-SBS provides an interpolation between RBM and MPS, the choice of number of strings and geometry can be seen as additional parameters which could be tuned further to improve over the performance of both methods. All networks have a training set accuracy very close to 100% when the bond dimension is larger than 6, and we expect that better regularization techniques or network architectures have to be developed to significantly increase the test set performances obtained here. We also optimized a snake-SBS with positive elements (by parametrizing each element in a tensor as the exponential of the new parameters), which is a graphical model. Using the same algorithm, we were not able to achieve better performance than 93% classification accuracy with bond dimensions up to 10. This shows that while having a structure closely related to …
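As an illustration of the CNN-as-feature-extractor idea in the excerpt above, here is a rough numpy/scipy sketch (not the paper's code; cnn_features is a hypothetical name) of one 5 × 5 convolutional layer with ReLU and 2 × 2 max pooling, producing a per-pixel feature vector to contract with one tensor-network leg.

import numpy as np
from scipy.signal import convolve2d

def cnn_features(image, kernels):
    # One convolutional layer (stride 1) + ReLU + 2x2 max pooling;
    # `kernels` is a list of 5x5 filters.
    maps = []
    for k in kernels:
        conv = convolve2d(image, k, mode='valid')   # (24, 24) for 28x28 input
        conv = np.maximum(conv, 0.0)                # ReLU
        h, w = conv.shape
        conv = conv[:h - h % 2, :w - w % 2]         # trim to even size
        maps.append(conv.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3)))
    # Stack into "an image with a third dimension collecting the features"
    return np.stack(maps, axis=-1)

image = np.random.rand(28, 28)
kernels = [np.random.randn(5, 5) for _ in range(6)]
print(cnn_features(image, kernels).shape)           # (12, 12, 6)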


Efthymiou, Hidary, Leichenauer, arxiv:1906.06329


Generative Modeling

MPS have been used to parameterize generative models

1. Trained by gradient optimization of log-likelihood

[Figure: training NLL (0 to 70) versus Dmax (10¹ to 10², log scale), with the theoretical minimum ln(|T|) marked]

Figure 5. Training NLL and sampling images for the |T| = 100 binarized MNIST dataset. ln(|T|) is the theoretical minimum of the NLL. The TTN exactly remembers all the information of the images when Dmax = |T|.

digits of 28 × 28 pixels of value 0 or 1. In order to facilitate comparison with other work, we directly use the same standard binary MNIST dataset that has been used in the analysis of Deep Belief Networks and been widely recognized by the machine learning community [41]. The dataset can be downloaded directly from the corresponding website [42].

We did three experiments on the binary MNIST dataset. In the first experiment we use 100 randomly selected images for training TTNs with different Dmax. The results are shown in Figure 5, where we can see that as the NLL gradually decreases, the quality of the generated samples becomes better. The training NLL decreases to its theoretical minimum as Dmax increases to |T|, where the sampled image will be exactly the same as one in the training set.
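A two-line numpy check (not from the paper) of why ln(|T|) is the floor: a model that memorizes the training set assigns each of the |T| images probability 1/|T|, so its average negative log-likelihood is exactly ln(|T|).

import numpy as np

def nll(probs):
    # NLL = -(1/|T|) * sum_j ln p(x_j), averaged over the training set
    return -np.mean(np.log(probs))

T = 100
print(nll(np.full(T, 1.0 / T)))   # ln(100) ~= 4.605
print(np.log(T))                  # same value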

In Figure 6 we plot the two-site correlation function of pixels. In each row, we randomly select three pixels, then calculate the correlation function of the selected pixels with all other pixels. The values of the correlations are represented by color. The real correlations extracted from the original data are illustrated in the top row, and correlations constructed from the learned MPS and TTN are shown in the bottom rows for comparison. For TTN and MPS, Dmax is 50 and 100 respectively, which correspond to the models with the smallest test NLL. As we can see, in the original dataset the correlation between pixels consists of short-range correlations and a small number of long-range correlations. The MPS model can faithfully represent the short-range correlations of pixels, while the TTN model performs well in both short-range and long-range correlations.
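For reference, a minimal numpy sketch (mine, not the paper's) of the connected two-site correlation estimator behind such maps, C_j = <x_i x_j> − <x_i><x_j>:

import numpy as np

def two_site_correlation(data, i):
    # data: (n_samples, n_pixels) array of binarized images;
    # returns the connected correlation of pixel i with every pixel
    mean = data.mean(axis=0)
    return (data * data[:, [i]]).mean(axis=0) - mean[i] * mean

data = (np.random.rand(1000, 784) < 0.3).astype(float)  # toy binary "images"
C = two_site_correlation(data, i=10 * 28)  # a pixel in the 10th row
print(C.shape)                             # (784,)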

Next we carry out experiments using the whole MNIST dataset with 50,000 training images to quantitatively compare the performance of TTN with existing popular machine learning models. The performance is characterized by evaluating the NLL on the 10,000 test images. We also applied the same dataset to the tree-structure factor graph and the MPS generative model, and compare on the same dataset the test NLL with the RBM, the Variational AutoEncoder (VAE) and PixelCNN, which currently gives the state-of-the-art performance.


Figure 6. Two-site correlation of pixels extracted from the original data (1st row), the MPS (2nd row) and the TTN model (3rd row). We randomly choose three pixels at the 10th row of the images. The Dmax of TTN is 50, the Dmax of MPS is 100, which correspond to the models with the smallest test NLL.

Table I. Test NLL of different models for the binary MNIST dataset

Model               Test NLL
Tree factor graph   175.8
MPS                 101.5
TTN-1d              96.9
TTN-2d              94.3
RBM                 86.3*  [41]
VAE                 84.8*  [43]
PixelCNN            81.3   [10]

* stands for approximated NLL.

Among these results, RBM and VAE evaluate the partition function only approximately, hence give only approximate NLLs, while TTN and MPS, together with PixelCNN, are able to evaluate the partition function exactly and give exact NLL values.

The results are shown in Table I, where we can see that the test NLL obtained by the tree-structure factor graph is 175.8 and the result of MPS is 101.45, with corresponding Dmax = 100, while for the TTN on the 1-D data representation (as depicted in Fig. 2(b)) with Dmax = 50, the test NLL already reduces to 96.88. With the same Dmax, the TTN performed on the 2-D data representation (as depicted in Fig. 2(a,c)) does even better, giving an NLL around 94.25. However, we see from the table that, compared to the state-of-the-art machine learning models, the tensor network models still have a lot of space to improve: the RBM using 500 hidden neurons and 25-step contrastive divergence reaches an NLL of approximately 86.3, and the PixelCNN with 7 layers gives an NLL around 81.3.

In Figure 7 we draw the sampled images from the TTN trained …

Han, Wang, Fan, Wang, Zhang, PRX 8, 031012 (2018)

Cheng, Wang, Xiang, Zhang, PRB 99, 155131 (2019)

MNIST data set

2. Trained by local-exact-solution algorithm

Stokes, Terilla, arxiv:1902.06888

parity (N=20) data set

[Excerpt from Stokes, Terilla, arxiv:1902.06888:]

The exact single-site DMRG algorithm begins with an initial vector ψ_0 ∈ M and produces ψ_1, ψ_2, … inductively by solving an effective problem

(13) H_{t+1} := a_{t+1}(H_{eff,t+1})

which we now describe. Let us drop the subscript t + 1 from the isometry a_{t+1} and the effective Hilbert space H_{eff,t+1} in the relevant effective problem; just be aware that the embedding

(14) a : H_eff → H

will change from step to step. The map a is defined using an MPS factorization of ψ_t in mixed canonical form relative to a fixed site which varies at each step according to a predetermined schedule. For the purposes of illustration, the third site is the fixed site in the pictures below.

(15) ψ_t = [MPS diagram in mixed canonical form]

The effective space is H_eff = W ⊗ V ⊗ W and the isometric embedding a : W ⊗ V ⊗ W → V^{⊗N} is defined for any f ∈ W ⊗ V ⊗ W by replacing the tensor at the fixed site of ψ_t with f:

(16) a [diagram]

To see that a is an isometry, use the gauge condition that the MPS factorization of ψ_t is in mixed canonical form relative to the fixed site, as illustrated below:

(17) ⟨a(f), a(f′)⟩ = [diagram] = [diagram] = ⟨f, f′⟩.

The adjoint map a* : V^{⊗N} → W ⊗ V ⊗ W has a clean pictorial depiction as well.

(18) a* [diagram]

To see that a* as pictured above is, in fact, the adjoint of a, note that for any h ∈ H and any f ∈ H_eff, both ⟨h, a(f)⟩ and ⟨a*(h), f⟩ result in the same tensor contraction:

(19) ⟨h, a(f)⟩ = [diagram] = ⟨a*(h), f⟩

In the picture above, begin with the blue tensors. Contracting with the yellow tensor gives a(f), and then contracting with the red tensor gives ⟨h, a(f)⟩. On the other hand, first contracting with the red tensor yields a*(h), resulting in ⟨a*(h), f⟩ after contracting with the yellow tensor.

[Figure 1 from the same paper; four panels on the unit sphere of H:]

(A) The initial vector ψ_0 and the vector ψ_bp lie in the unit sphere of H.

(B) The vector ψ_0 is used to define the subspace H_1. The unit vectors in H_1 define a lower-dimensional sphere in H. The vector ψ_1 is the vector in that sphere that is closest to ψ_bp.

(C) The vector ψ_1 is used to define the subspace H_2. The unit sphere in H_2 contains ψ_1 but does not contain ψ_0. The vector ψ_2 is the unit vector in H_2 closest to ψ_bp.

(D) The vector ψ_2 is used to define the subspace H_3. The vector ψ_3 is the unit vector in H_3 closest to ψ_bp. And so on.

FIGURE 1. A bird's-eye view of the training dynamics of exact single-site DMRG on the unit sphere.

Let α : H_eff → H be an isometric embedding of a Hilbert space H_eff into H. We refer to H_eff as the effective Hilbert space. The isometry α and its adjoint map α* are summarized by the following diagram,

[diagram: α : H_eff → H and α* : H → H_eff, with id_{H_eff} on H_eff and the projection P on H]

The composition α*α = id_{H_eff} is the identity on H_eff. The composition in the other order, αα*, is an orthogonal projection P onto α(H_eff), which is a subspace of H isometrically isomorphic to H_eff.

Page 91

Concatenating multiple MPS ("string-bond states") and pre-processing with a CNN layer gives accuracy competitive with state-of-the-art CNNs on Fashion MNIST:


FIG. 12. Using convolutional neural networks as feature vector extractors from real data: the output of the CNN is seen as an image with a third dimension collecting the different features. For each pixel of this image, the vector of features is contracted with the open legs of a tensor network.

V. NUMERICAL EXPERIMENTS

We test the generalized tensor network approach on the task of image classification, where a natural two-dimensional geometry that can be reflected in the architecture of the tensor network is present, as well as on the task of urban sound recognition, where the time dimension provides a one-dimensional geometry.

A. Image classification

We first consider the MNIST dataset [48], which consists of 28 × 28 greyscale images of digits. There are 10 classes and we adopt a multiclass classification procedure in which one tensor of the tensor network is parametrized by the ten possible labels. The original training set is split into training and validation sets of 55000 and 5000 examples and the performance of the different models is evaluated on the test set of 10000 examples. We consider the following generalized tensor networks: a snake-SBS with 4 strings (Fig. 5b), a 2D-SBS (Fig. 5a), an EPS with a 2 × 2 translationally invariant plaquette combined with a linear classifier (Fig. 7c), an EPS-SBS with translationally invariant plaquette combined with a snake-SBS (Fig. 6), and a CNN-snake-SBS which uses a 1-layer CNN as input features (Fig. 12). The CNN considered here uses a convolutional layer applying six 5 × 5 filters (stride 1) with ReLU activation function and a pooling layer performing max pooling with a 2 × 2 filter. All other networks use the choice of features presented in Eq. (26), and the greyscale values are normalized between 0 and 1. We compare the performance of these networks with a MPS and a RBM (the number of hidden units, 250, 500, 750 or 1000, is taken as a hyperparameter). All networks use a batch size of 20 examples, and hyperparameters such as the learning rate, the regularization rate and the number of iterations over the training set are determined through a grid search while evaluating the performance on the validation set. Best performance is typically achieved with a learning rate of 10⁻⁴, a regularization rate of 0.95 and a hundred iterations.

FIG. 13. Examples of images from the MNIST (a) and fashion MNIST (b) datasets.

FIG. 14. Test set accuracy of different generalized tensor networks on the MNIST dataset.

The test set accuracy, presented in Fig. 14, shows that even with a very small bond dimension generalized tensor networks are able to accurately classify the dataset. Their performance is significantly better than that of a tree tensor network [46] or a MPS trained in frequency space [45], and while a MPS can also achieve 99.03% accuracy with a bond dimension of 120 [15], the cost of optimizing very large tensors has prohibited the use of this method for larger problems so far. The snake-SBS with bond dimension larger than 6 also performs better than a RBM. Since the snake-SBS provides an interpolation between RBM and MPS, the choice of the number of strings and the geometry can be seen as additional parameters which could be tuned further to improve over the performance of both methods. All networks have a training set accuracy very close to 100% when the bond dimension is larger than 6, and we expect that better regularization techniques or network architectures have to be developed to significantly increase the test set performances obtained here. We also optimized a snake-SBS with positive elements (by parametrizing each element in a tensor as the exponential of a new parameter), which is a graphical model. Using the same algorithm, we were not able to achieve better performance than 93% classification accuracy with bond dimensions up to 10. This shows that while having a structure closely related to …

fashion MNIST

Method                   Test Accuracy
AlexNet                  88.90%
Tree tensor network      88.97%
String bond state        89.2%
CNN-String bond state    92.3%
GoogLeNet                93.7%

Glasser, Pancotti, Cirac, arxiv:1806.05964

Page 92

Extension #1: hierarchical optimization of tensor networks, an unsupervised/supervised hybrid


covariance matrix can be seen in the last two panels of Fig. 7. The manipulations there show the fidelity F₁ can be written in terms of ρ₁₂ as

F_1 = \sum_{s_1 s_2 s_1' s_2' t} U_1^{\dagger\,t}{}_{s_1' s_2'} \; \rho_{12}{}^{s_1' s_2'}{}_{s_1 s_2} \; U_1{}^{s_1 s_2}{}_{t} \qquad (17)

It follows that the optimal isometry U₁ can be computed by diagonalizing ρ₁₂ as

\rho_{12} = U_1 P_{12} U_1^{\dagger} \qquad (18)

here viewing ρ₁₂ as a matrix with row index (s₁s₂) and column index (s₁′s₂′) as shown in Fig. 8. The matrix P₁₂ is a diagonal matrix whose diagonal elements are the eigenvalues of ρ₁₂. After the diagonalization, the columns of U₁ are chosen to be the eigenvectors corresponding to the D largest eigenvalues of ρ₁₂. Let the rank of the matrix ρ₁₂ be r and call its eigenvalues {p_i}_{i=1}^{r}. One way to determine the number D of eigenvalues to keep is to choose a predetermined threshold ε and define D such that the truncation error E is less than ε, where the truncation error is defined as

E = \frac{\sum_{i=D}^{r} p_i}{\mathrm{Tr}[\rho_{12}]} < \varepsilon \qquad (19)

This is the same procedure proposed by White in the context of the density matrix renormalization group

FIG. 7. The fidelity F is defined as the average inner product of the training set feature vectors, or equivalently the trace of the covariance matrix ρ. The isometry U₁ is chosen to maximize the fidelity following coarse graining (second panel above), which is equivalent to maximizing the trace of ρ₁₂ after coarse graining (last panel above).

FIG. 8. (a) Definition of the reduced covariance matrix ρ₁₂; (b) computation of the optimal isometry U₁ by diagonalizing ρ₁₂ and truncating its smallest eigenvalues; (c) definition of the reduced covariance matrix ρ₃₄.

(DMRG) algorithm used in quantum mechanics, where the Φ_j^s are analogous to an ensemble of wavefunctions enumerated by j; ρ is the full density matrix; and ρ₁₂ is a reduced density matrix [24].

To compute the remaining isometries which will form the first layer, the procedure continues in an analogous fashion by next computing the reduced covariance matrix ρ₃₄ as in Fig. 8(c) and diagonalizing it to obtain the isometry U₃₄. Note that the calculation of the reduced covariance matrices, as well as the individual summations over the training data used to produce them, can be performed in parallel. What is more, we find that when summing over the training data in a random order, the reduced covariance matrices typically converge before summing over the entire training set, and this convergence can be monitored to ensure a controlled approximation. After diagonalizing the reduced covariance matrices for every pair of local indices (s_{2i−1}, s_{2i}), one obtains the first layer of isometries depicted in Fig. 9.

The isometry layer can now be used to coarse grain each of the training set feature vectors Φ(x_j).

FIG. 9. Having determined a layer of isometries, these isometries can be used to coarse grain the feature vectors.
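A minimal numpy sketch of one such coarse-graining step follows (not the paper's code; it assumes a normalized d = 2 local feature map, so the environment of each pair of sites traces to exactly 1, and uses random stand-in data):

```python
import numpy as np

# Sketch of one coarse-graining step: reduced covariance of a pair of
# sites, eigendecomposition, and truncation as in Eq. (19).
np.random.seed(0)
NT, N = 500, 8                       # training samples, sites
X = np.random.rand(NT, N)            # stand-in for pixel data in [0, 1]

phi = np.empty((NT, N, 2))           # normalized local feature vectors
phi[:, :, 0] = np.cos(np.pi * X / 2)
phi[:, :, 1] = np.sin(np.pi * X / 2)

def pair_covariance(s):
    """rho for sites (s, s+1): 4x4, row index (s1 s2), column (s1' s2')."""
    pair = np.einsum('ja,jb->jab', phi[:, s, :], phi[:, s + 1, :])
    pair = pair.reshape(NT, 4)
    return pair.T @ pair / NT

def isometry(rho, eps=1e-3):
    """Keep the D largest eigenvalues so the truncation error of
    Eq. (19) stays below eps."""
    p, U = np.linalg.eigh(rho)
    p, U = p[::-1], U[:, ::-1]                    # descending order
    tail = np.cumsum(p[::-1])[::-1] / np.trace(rho)
    D = 1
    while D < len(p) and tail[D] > eps:           # tail[D] = discarded weight
        D += 1
    return U[:, :D]

U1 = isometry(pair_covariance(0))
print('U1 shape:', U1.shape)                      # (4, D)

# Coarse grain: contract each pair of local vectors with the isometry
coarse = np.einsum('ja,jb,abt->jt',
                   phi[:, 0, :], phi[:, 1, :], U1.reshape(2, 2, -1))
```

As the text notes, the sum over training samples can be streamed and monitored for convergence rather than computed over the whole set at once.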

Data feature map:

FIG. 3. Choosing the feature map (a) to be a tensor product of local feature maps leads to a model f(x) of the form (b), where the weight parameters W have (c) the structure of an order-N tensor.

size, then many singular values s_n will be very small or zero and the corresponding rows of U† can be discarded. Following such a truncation, Eq. (5) says that to a good approximation, the optimal weights can be parameterized within a significantly reduced space of parameters, and U† is the transformation from the entire feature space to the reduced parameter space.

Computing the singular value decomposition of Φ_j^s directly would not scale well for large training sets or high-dimensional feature maps, yet as we will show it is nevertheless possible to efficiently determine the transformation U in truncated form. Observe that U diagonalizes the feature space covariance matrix [50] defined as

\rho^{s'}_{\ s} = \frac{1}{N_T} \sum_{j=1}^{N_T} \Phi^{s'}_j \Phi^{\dagger}_{j s} \qquad (6)
\phantom{\rho^{s'}_{\ s}} = \sum_{n} U^{s'}_{\ n} P_n U^{\dagger}_{n s} \qquad (7)

where P_n = (S_{nn})² are the eigenvalues of the Hermitian

matrix ρ. As we demonstrate in Sec. III below, the feature space covariance matrix ρ is amenable to decomposition as a layered tensor network. Computing every layer of this network can provide an efficient expression for the elements of the basis U corresponding to the largest eigenvalues of ρ. Computing only some of the layers still has the beneficial effect of projecting out directions in feature space along which ρ has small or zero eigenvalues. By carrying out an iterative procedure to truncate directions in feature space along which ρ has a very small projection, one can rapidly reduce the size of the space needed to carry out learning tasks.

We will also see that ρ is not the only choice of matrix for determining a tensor network basis for features. As demonstrated in Sec. V, other choices result in a network more adapted for a specific task, and can have fewer latent parameters without reducing model performance.

B. Tensor Product Feature Maps

Before describing the algorithm to partially or fully diagonalize ρ as a tensor network, we briefly review the class of feature maps which lead to a natural representation of model parameters as a tensor network, as discussed in Refs. 4–6. These are feature maps Φ(x) which map inputs x from a space of dimension N into a space of dimension d^N with a tensor product structure. The simplest case of such a map begins by defining a local feature map φ^{s_j}(x_j) where s_j = 1, 2, …, d. These local feature maps define the full feature map as:

\Phi^{s_1 s_2 \cdots s_N}(x) = \phi^{s_1}(x_1)\,\phi^{s_2}(x_2) \cdots \phi^{s_N}(x_N) \qquad (8)

as shown in Fig. 3(a), where placement of tensors next to each other implies an outer product. This choice of feature map leads to models of the form

f(x) = \sum_{s_1 s_2 \cdots s_N} W_{s_1 s_2 \cdots s_N}\, \phi^{s_1}(x_1)\,\phi^{s_2}(x_2) \cdots \phi^{s_N}(x_N) \qquad (9)

which are depicted in Fig. 3(b). As evident from the above expression, the weight parameters are indexed by N indices of dimension d. Thus there are d^N weight parameters and W is a tensor of order N. We will be interested in the case where d is small (of order one or ten) and N is many hundreds or thousands in size.
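To make Eqs. (8) and (9) concrete, here is a small hedged sketch (assumptions: the cos/sin local feature map common in this literature, and W stored as a random MPS rather than trained weights) showing that f(x) can be evaluated in O(N d D²) operations without ever forming the d^N-element tensor W:

```python
import numpy as np

# Sketch of Eqs. (8) and (9) with W assumed to be in MPS form.
def local_phi(x):
    return np.array([np.cos(np.pi * x / 2), np.sin(np.pi * x / 2)])

N, d, D = 10, 2, 5
dims = [1] + [D] * (N - 1) + [1]
W_mps = [0.5 * np.random.randn(dims[i], d, dims[i + 1]) for i in range(N)]

def f(x):
    L = np.ones(1)
    for i in range(N):
        # Contract the local feature vector with site i, then absorb
        A = np.einsum('s,asb->ab', local_phi(x[i]), W_mps[i])
        L = L @ A
    return L.item()

print(f(np.random.rand(N)))
```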

Of course, manipulating or even storing d^N parameters quickly becomes impossible as N increases. A solution that is both practical and interesting is to assume that the optimal weights W can be efficiently approximated by a tensor network [12, 28], an idea proposed recently by several groups [4–6, 36].

A tensor network is a factorization of an order-N tensor into the contracted product of low-order tensors. Key examples of well-understood tensor networks for which efficient algorithms are known are depicted in Fig. 2 and include:

• the matrix product state (MPS) [19–21, 51] or tensor train decomposition [52], Fig. 2(a)

• the PEPS tensor network [22], Fig. 2(b)

• the tree tensor network [53, 54] or hierarchical Tucker decomposition [55], Fig. 2(c)

• the MERA tensor network [23, 31], Fig. 2(d).

Each of these networks makes various tradeoffs in terms of how complicated they are to manipulate versus their ability to represent statistical systems with higher-dimensional interactions or more slowly decaying correlations. A good introduction to tensor networks in the physics context is given by Orus in Ref. 12 and in a mathematics context by Cichocki in Ref. 56. Other detailed reviews include Refs. 28, 32, 57, and 58.


Deterministic tensor learning:

Resulting model – train top tensors for supervised objective:

89% accuracy on Fashion MNIST data set

Stoudenmire, Quant. Sci. Tech. 3, 034003 (2018) [arxiv:1801.00315]


\rho^{\mu}_{12} = \mu\, \rho^{W}_{12} + (1 - \mu)\, \rho_{12}

FIG. 11. In the mixed unsupervised/supervised algorithm for determining the tree tensors making up U, the reduced covariance matrices are a weighted sum of the reduced training data covariance matrix and the reduced covariance matrix from the provided supervised weights in MPS form. The figure above shows the computation of the mixed reduced covariance for the first two sites; the computation for other pairs of sites is similar, just with different choices for which indices are traced or left open.

The key algorithmic difference from the unsupervised case is that after determining each layer, one must also coarse grain the provided weights W along with the training data so one can compute the reduced covariance matrices from ρ^μ at the next scale. Although the weights W in MPS form have additional internal indices as shown in Fig. 2(a), it is straightforward to coarse grain an MPS with a tree tensor network layer: one simply contracts each isometry with pairs of MPS tensors.

For the case of a multi-class supervised task there will be multiple prior weight MPS W^ℓ, one for each label ℓ (or one can equivalently provide a single MPS with an external or uncontracted label index). To generalize the above algorithm to the multi-task setting, one defines the covariance matrix ρ_W as the sum over the covariance matrices of each of the prior supervised weights W^ℓ:

(\rho_W)_{s s'} = \sum_{\ell} W^{\dagger \ell}_{s}\, W^{\ell}_{s'} \qquad (28)
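As a toy illustration of Eq. (28) and of the mixing ρ^μ = μ ρ_W + (1 − μ) ρ, the sketch below uses random stand-ins for the feature vectors Φ_j and the prior weights W^ℓ, with N kept small enough that the full d^N-dimensional feature space fits in memory (the actual algorithm works with reduced covariances pair by pair):

```python
import numpy as np

# Toy mixed covariance: real-valued stand-ins, full feature space.
np.random.seed(1)
N, d, n_labels, NT = 4, 2, 3, 200
dim = d ** N                              # d^N = 16

Phi = np.random.randn(NT, dim)            # stand-ins for feature vectors Phi_j
W = np.random.randn(n_labels, dim)        # stand-ins for prior weights W^l

rho_data = Phi.T @ Phi / NT               # (1/NT) sum_j Phi_j Phi_j^dagger
rho_W = W.T @ W                           # Eq. (28), summed over labels
mu = 0.5
rho_mu = mu * rho_W + (1 - mu) * rho_data # mixed covariance for the tree

print(np.allclose(rho_mu, rho_mu.T))      # remains symmetric (Hermitian)
```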

To test whether the strategy of mixing in a prior estimate of the supervised task weights results in an improved model, we experiment again on the MNIST handwritten digits data set. Using a mixing parameter μ = 0.5 and a truncation error cutoff ε = 4 × 10⁻⁴ results in a tree tensor network with top index sizes 279 and 393, where after making the tree layers in a single pass, only the top tensor is optimized further for the supervised task. Despite the top index sizes being significantly smaller than those for the best experiment in Sec. IV (where the sizes were 328 and 444), the results are slightly better: the cost function value is C = 0.0325, training set accuracy is 99.798%, and test set accuracy is 98.110%. This experiment strongly suggests that mixing weights trained for the supervised task with the covariance matrix based purely on the data leads to a representation of the data more suited for the specific task, which can be compressed further without diminishing performance.

VI. PARTIAL COARSE GRAINING: TREE CURTAIN MODEL

While the approaches in the previous sections involved computing tree tensor networks with the maximum number of layers, computing fewer layers can balance the benefits of a compressed data representation against the loss of expressiveness from accumulated truncations when computing more layers.

One interesting aspect of computing fewer tree layers is that after coarse graining, the data are still represented as high-order tensors, similar to Fig. 9. Specifically, the order of the data tensors after R rescalings will be N_top = N/2^R. Therefore to complete the model, the top tensor must also be a tensor of order N_top if the output is to be a scalar, or order N_top + 1 for vector-valued output in the multi-task case. So to complete the model one can use another type of tensor network to represent the top tensor, such as a matrix product state.

Choosing a matrix product state (MPS) form of the top tensor results in the architecture shown in Fig. 12. After coarse graining the training data through the tree layers, one can optimize the top MPS using previously developed methods for supervised [5, 6] or unsupervised [39] learning tasks. The resulting model resembles an MPS with a tree "curtain" attached.

To test the effectiveness of the partial coarse-grained approach, and the resulting tree-curtain model, we study the fashion MNIST dataset [60]. Similarly to MNIST, the data set consists of 28 × 28 grayscale images with ten labels, with 60,000 training and 10,000 test images. However, the supervised learning task is significantly more challenging than MNIST because the images consist of photographs of a variety of clothing (shirts, shoes, etc.). Example images from the data set are shown in Fig. 13.

To compute a tree curtain model for fashion MNIST, we first train a linear classifier, resulting in only 83% test accuracy. We then represent each of the linear classifier vectors V^ℓ as MPS and use the mixed covariance matrix approach (Sec. V) with mixing parameter μ = 0.9 to optimize four tree tensor layers. Using a truncation …

FIG. 12. Computing only a few tree layers results in a model with a high-order top tensor. In the model above the top tensor w^ℓ is represented as a matrix product state with a label index ℓ appropriate for the multi-task case.

Page 93

Extension #2: algorithm for quantum machine learning identical to optimization of a tensor network!

Huggins, Patel, Whaley, Stoudenmire, Quant. Sci. Tech. 4, 024001 (2019) [arxiv:1803.11537]

Tensor network inspired quantum circuit architecture

Qubit efficient scheme: [circuit diagram with product-state inputs; measured qubits are reset and reused ("Reuse!")]

Huggins, Patel, Whaley, Stoudenmire, arxiv:1803.11537
see also Cramer et al., Nat. Comm. 2010


FIG. 12. Mapping of the generative matrix product state (MPS) quantum circuit with V = 3 to a bond dimension D = 2³ MPS tensor network diagram. First (a) interpret the circuit diagram as a tensor diagram by interpreting reference states ⟨0| as vectors [1, 0], qubit lines as dimension 2 tensor indices, and measurements as setting indices to fixed values. Then (b) contract the reference states into the unitary tensors and (c) redraw the tensors in a linear chain. Finally, (d) merge three D = 2 indices into a single D = 8 dimensional index on each bond.

Given the ability to measure and reset a subset of physical qubits, a key advantage of implementing a discriminative or generative tensor network model based on an MPS is that for a model with V virtual qubits, an arbitrary number of inputs or outputs can be processed by using only V + 1 physical qubits. The circuits illustrating how this can be done are shown in Fig. 11.

The implementation of the discriminative algorithm shown in Fig. 11(a) begins by preparing and entangling V input qubit states. One of the qubits is measured and reset to the next input state. Then all V + 1 qubits are entangled and a single qubit measured and re-prepared. Continuing in this way, one can process all of the inputs. Once all inputs are processed, the model output is obtained by sampling one or more of the physical qubits.

To implement the generative MPS algorithm shown in Fig. 11(b), one prepares all qubits in a reference state |0⟩^{⊗(V+1)} and, after entangling the qubits, one measures and records a single qubit to generate the first output value. This qubit is reset to the state |0⟩ and all the qubits are then acted on by another (V + 1)-qubit unitary. A single qubit is again measured to generate the second output value, and the algorithm continues until N outputs have been generated.

To understand the equivalence of the generative circuit of Fig. 11(b) to conventional tensor diagram notation for an MPS, interpret the circuit diagram Fig. 12(a) as a tensor network diagram, treating elements such as reference states ⟨0| as tensors or vectors [1, 0]. One can contract or sum over the reference state indices and merge any V qubit indices into a single index of dimension D = 2^V. The result is a standard MPS tensor network diagram, Fig. 12(d), for the amplitude of observing a particular set of values of the measured qubits.
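The index bookkeeping of Fig. 12 can be sketched in a few lines (an illustration only, with an assumed wire ordering; this is not hardware code):

```python
import numpy as np

# Each (V+1)-qubit unitary applied before a single-qubit measurement
# becomes one MPS tensor: fix the measured wire's output index to the
# observed value and the reset wire's input index to |0>, and the V
# remaining wires merge into bond indices of dimension D = 2^V.
V = 3
dim = 2 ** (V + 1)
M = np.random.randn(dim, dim) + 1j * np.random.randn(dim, dim)
U, _ = np.linalg.qr(M)                 # random (V+1)-qubit unitary

# Assume the measured/reset qubit is the first tensor factor, so that
# U reshapes to U[s_out, r_out, s_in, r_in] with s the measured wire
# (dim 2) and r the V virtual wires (dim 2^V).
T = U.reshape(2, 2 ** V, 2, 2 ** V)
A = T[:, :, 0, :]                      # set the reset-wire input to |0>
print(A.shape)                         # (2, 8, 8): one 8x8 matrix per outcome
```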

C. Noise Resilience

Any implementation of our proposed approach on near-term quantum hardware will have to contend with a significant level of noise due to qubit and gate imperfections. But one intuition about noise effects in our tree models is that an error which corrupts a qubit only scrambles the information coming from the patch of inputs belonging to the past "causal cone" of that qubit. And because the vast majority of the operations occur near the leaves of the tree, the most likely errors therefore correspond to scrambling only small patches of the input data. We note that a good classifier should naturally be robust to small deformations and corruptions of the input, and, in fact, adding various kinds of noise during training is a commonly used strategy in classical machine learning. Based on these intuitions, we expect our circuits could demonstrate a high level of tolerance to noise.

In order to quantitatively understand the robustness of our proposed approach to noise on quantum hardware, …

FIG. 13. The test accuracy for each of the pairwise classifiers under noise corresponding to a T₁ of 5 µs, a T₂ of 7 µs, and a gate time of 200 ns. In most cases, the accuracy is comparable to the results from training without noise. Note that it was necessary to choose a different set of hyper-parameters to enable successful training under noise.


D = 2³ (see also Cramer et al., Nat. Comm. 2010)

Reuse!

MPS with exponentially large bond dimensions

Page 94

Related idea: infinite DMRG on quantum computers with a finite number of physical qubits

Barratt, Dborin, Bal, Stojevic, Pollmann, Green, arxiv:2003.12087

Some of these techniques could be used for machine learning too

Page 95

Review: Notable Recent Works Using Tensor Network Machine Learning

Page 96

Unsupervised Generative Modeling Using MPS

• Map data to product state, tensor network weights
• Squared output is probability: a "Born machine"
• "Perfect" sampling (no autocorrelation)

[diagram: MPS Born machine over inputs x₁, x₂, …, x₆]

Zhao-Yu Han, Jun Wang, Heng Fan, Lei Wang, Pan Zhang, PRX 8, 031012 (2018)

p(x) = |Ψ(x)|² / Z
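A minimal sketch of such a Born machine and its direct sampling follows (random MPS tensors stand in for a trained model; each variable is drawn from its exact conditional, so there is no Markov chain and no autocorrelation between samples):

```python
import numpy as np

# Direct ("perfect") sampling from a small real-valued MPS Born machine.
np.random.seed(0)
N, d, D = 8, 2, 4
dims = [1] + [D] * (N - 1) + [1]
A = [np.random.randn(dims[i], d, dims[i + 1]) for i in range(N)]

# Right environments E[i]: contraction of sites i..N-1 of <Psi|Psi>
E = [None] * (N + 1)
E[N] = np.ones((1, 1))
for i in range(N - 1, -1, -1):
    E[i] = np.einsum('asb,csd,bd->ac', A[i], A[i], E[i + 1])

def sample():
    x, L = [], np.ones((1, 1))   # L: left environment of the drawn prefix
    for i in range(N):
        # Unnormalized p(x_i = s | prefix): |Psi|^2 marginalized over the rest
        p = np.einsum('ac,asb,csd,bd->s', L, A[i], A[i], E[i + 1])
        p = np.abs(p) / np.abs(p).sum()
        s = np.random.choice(d, p=p)
        x.append(s)
        v = A[i][:, s, :]
        L = np.einsum('ac,ab,cd->bd', L, v, v)
    return x

print(sample())
```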


Clearly this is an advantage of our model over the Hopfield model and inverse Ising model, whose maximum model capacity is proportional to system size.

Careful readers may complain that the inverse Ising model may not be the correct model to compare with, and that inverse Ising models with hidden variables, i.e. Boltzmann machines, do have infinite representation power. Actually this is exactly our point of using MPS as an unsupervised model: increasing the bond dimensions of tensors in an MPS has a similar function to increasing the number of hidden variables in other generative models.

In Fig. 3(b) we plot L as a function of system size N, trained on |T| = 100 random patterns. From the figure we can see that with Dmax fixed, L increases linearly with system size N, which indicates that our model gives worse capability with a larger system size. This is due to the fact that keeping the joint distribution of variables becomes more and more difficult for an MPS when the number of variables increases. A simple example is that the correlation between two arbitrary variables decays quickly as the distance between the two variables increases. This is actually a drawback of our model when compared with pair-wise models such as the inverse Ising model, which is the maximum-entropy model given the mean and variance of the probability distribution of the training samples, and hence is able to capture long-distance correlations of the training data easily. Fortunately Fig. 3(b) also shows that the decay of capability with system size can be compensated by increasing Dmax, which reflects the universal approximation capability of MPS without any constraint on Dmax [15].
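For reference, the quantity L reported here is the mean negative log-likelihood, L = −(1/|T|) Σ_{x∈T} ln p(x) with p(x) = |Ψ(x)|²/Z. A small sketch, reusing the MPS tensors A and environments E from the sampling sketch above (ln |T| is the minimum L attainable on the training data):

```python
import numpy as np

# Mean NLL of a real-valued MPS Born machine over a dataset.
def log_p(x, A, E):
    amp = np.ones((1, 1))
    for i, s in enumerate(x):
        amp = amp @ A[i][:, s, :]      # accumulate the amplitude Psi(x)
    Z = E[0].item()                    # norm <Psi|Psi>
    return np.log(amp.item() ** 2) - np.log(Z)

def mean_nll(data, A, E):
    return -np.mean([log_p(x, A, E) for x in data])
```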

C. MNIST data set

In this subsection we perform a generative modeling experiment on the MNIST dataset [49]. Since MNIST images are in grayscale, we first turn them into binary numbers by random binarization, then flatten the images row by row into a vector, as was done for the BS dataset in Sec. III A. For the purpose of unsupervised generative modeling we do not need to use the labels of the digits.

Firstly we randomly selected |T| = 100 samples from the MNIST dataset and trained the MPS with different maximal bond dimensions Dmax. The results are shown in Fig. 4. The images in the insets are random generations by direct sampling from the MPS at different NLL values (denoted by the arrows). It is shown that as Dmax increases, L converges gradually to its minimum ln(|T|), and the generations become more and more clear. It is interesting to see that with a relatively small maximum bond dimension, e.g. Dmax = 100, although the handwriting is not as clear as with a large bond dimension, some crucial features, such as the loops in images with label "6" and the corner in images with label "7", have already been reconstructed. When the maximum bond dimension increases to a larger value, the MPS is able to learn clear characteristics of an image such as the pen strokes, hence we can say that the MPS has learned many "prototypes". It has recently been reported that the feature-to-prototype transition in pattern recognition can be observed

FIG. 4: The mean negative log-likelihood L of an MPS trained using |T| = 100 MNIST images of size 28 × 28, with varying maximum bond dimension Dmax. In all experiments, parameters of learning are set to cutoff = 1 × 10⁻⁶, n_b = 1, η = 0.001. Each MPS is trained over at most 200 loops; the training is terminated if the difference of L between consecutive loops is less than 0.002. The horizontal dashed line indicates the Shannon entropy of the training dataset, ln |T|, which is also the minimum value of L on the training data.

by using a many-body interaction in the Hopfield model, or equivalently using a higher-order rectified polynomial activation function in deep neural networks [50]. It is interesting to remark that in our model this can be achieved by simply adjusting the maximum bond dimension of the MPS.

Secondly we randomly selected 1000 images from the MNIST database for training. We restricted the maximum bond dimension to Dmax = 800. The first loop was carried out under the hyperparameter configuration cutoff = 0.3, η = 0.05, n_b = 10, and the following 150 loops came with cutoff = 1 × 10⁻⁷, η = 1 × 10⁻³, n_b = 20. On each bond we performed 10 steps of stochastic gradient descent, cf. Eq. (11). After 251 loops of training, the NLL on the training dataset reached 16.8, and many bond dimensions had reached Dmax. Figure 5 shows the distribution of bond dimensions, where we can see that large bond dimensions concentrate in the center of the image, where the variation of the pixels is the largest. The bond dimensions around the top and bottom edges of the image remained small, because those pixels are always inactivated in the training samples. They carry no information and have no correlations with the remaining part of the image. Interestingly, although the pixels on the left and right edges are also white, they have larger bond dimensions because these bonds learned to mediate the correlations between the rows of the images.

The samples directly generated after training are shown in Fig. 6(a). We also show a few original samples from the training set in Fig. 6(b) for comparison. Although many of the generated images cannot be recognized as digits, some aspects of the result are worth mentioning.


FIG. 5: Bond dimensions of the learnt MPS. Each pixel in this figure corresponds to the bond dimension of the right leg of the tensor associated to the identical coordinate in the original image.

Firstly, the MPS learnt to leave the margin blank with a width of 4 pixels, which is the most obvious common feature in the MNIST database. Secondly, the activated pixels compose kinds of pen strokes that can be extracted from the digits. Finally, a few of the samples can already be recognized as digits. We expect that as one increases the maximal bond dimension and keeps on training, the MPS will produce more realistic images. Unlike the discriminative learning task carried out in [30], it seems we need to use much larger bond dimensions to achieve a good performance in the unsupervised learning. We think the reason is that in the classification task, local features of an image are used for predicting the label of the image. Those local features, such as the presence of a loop for label "6", usually span a shorter range of pixels, and thus do not require the MPS to remember longer-range correlations between pixels. However it is necessary for generative modeling, because learning the joint distribution from the data consists of (but is not limited to) learning two-point correlations between pairs of variables that could be far from each other.

FIG. 6: (a) Images generated from an MPS trained on the 1000-image training set over 251 loops, achieving a final average NLL of 16.8. (b) Original images randomly selected from the training set.

FIG. 7: Image reconstruction from partial images by direct sampling of an MPS that has been trained on the 1000-image training set over 251 loops. (a,b) Restoration of images in the training set, e.g. those shown in Fig. 6(b). (c,d) Reconstruction of 16 images chosen from the test set; the test set contains images from the MNIST database that were not used for training. The given parts are in black and the reconstructed parts are in gray. The reconstructed parts are: (a,c) 12 columns from either the left or the right; (b,d) 12 rows from either the top or the bottom.


Thirdly we carried out an image restoration experiment on the MPS trained on 1000 images. As shown in Fig. 7 we first remove part of the images from the samples in Fig. 6(b), then reconstruct the ungiven pixels (in gray) using direct sampling conditioned on the given parts. For column reconstruction, the performance is remarkable. The reconstructed images in Fig. 7(a) are almost identical with the original ones in Fig. 6(b). On the other hand, for row reconstruction in Fig. 7(b), it made interesting but reasonable deviations. For instance, in the rightmost image of the first row, a "1" has been bent into a "7". We also checked its ability to reconstruct MNIST images other than the training images, as shown in Fig. 7(c,d). This indicates that the MPS has learned crucial features of the dataset, rather than merely memorizing the training instances. In fact, even as early as when trained over only 11 loops, the MPS could perform column reconstruction with similar image quality, but its row reconstruction performance was much worse than later when …

Reconstructing Testing Images

Page 97

Supervised Learning with Generalized Tensor Networks
I. Glasser, N. Pancotti, J.I. Cirac, arxiv:1806.05964

string-bond states

• Models where inputs are copied, then processed by multiple tensor networks
• Hybrid CNN / string-bond architecture gives 92.3% on the fashion MNIST test set!

entangled plaquette states


FIG. 3. Copy operation of a vector input A_i, resulting in a new tensor B_{ij} = A_i A_j, or of a tensor network.

FIG. 4. (a) Restricted Boltzmann Machine (RBM) consisting of visible and hidden variables. (b) String-Bond State with 1D geometry generalizing the RBM; the legs corresponding to contracted indices in each MPS are depicted in orange for visibility. (c) Short-range RBM with local connections between visible and hidden variables. (d) Entangled Plaquette State (EPS) generalizing the short-range RBM.

(Fig. 3). More generally one can apply this copy operation to a larger tensor network, which is then copied. In practice one would first contract the incoming network, resulting in a vector. This vector is then copied and sent to the remaining edges. The rest of the tensor network can then be independently contracted. We impose that there are no directed loops containing dots and ensure that the tensor network can be efficiently contracted as long as it is contracted in the right order.

In general, it is not possible to write this dot as a tensor [43]. In particular cases, when all the inputs are discrete and in a fixed basis, it is possible to write the copy operation as a COPY-dot tensor, as introduced in Ref. [53]. For discrete inputs and when the copy operation only applies to the inputs, the generalized tensor network can therefore be written as a tensor network including COPY-dot tensors. In the more general case, while it is not possible to represent the copy operation as a tensor, duplicating a vector and sending it to two different parts of the calculation is easily achieved in practice. The generalized tensor network can in this case be viewed as a larger tensor network containing several copies of the same tensors (weight sharing) and with inputs which are copied several times.

Examples of such tensor networks with copying of the input states have been used in the quantum physics community. The simplest example are Entangled Plaquette States (EPS) [40–42], also known as Correlator Product States, in which the tensor network is defined as a product of smaller tensors on overlapping clusters of variables:

T_{x_1,\dots,x_N} = \prod_{p=1}^{P} T^{p}_{x_p} \qquad (11)

where a coefficient T^p_{x_p} is assigned to each of the 2^{n_p} (for binary variables) configurations x_p of the variables in cluster p. Because the clusters overlap, the value of each variable is copied to all the tensors in which it is included. A sparse or short-range RBM is a special case of EPS [31] (eventually non-local for a sparse but non-local RBM) in which the tensor T^p_{x_p} takes the particular form (1 + e^{\sum_j w_{pj} x_j}), where the sum is limited to the variables in each cluster. EPS also share some similarity with Convolutional Neural Networks (CNN): a convolutional layer with discrete inputs is a particular case of EPS with weight sharing between the tensors. The EPS realizes all possible convolutional filters over a discrete input space, instead of selecting a small number of filters as CNNs do. A toy evaluation of Eq. (11) is sketched below.
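```python
import numpy as np

# Toy evaluation of Eq. (11): an EPS weight as a product of plaquette
# coefficients over overlapping clusters (hypothetical clusters and
# random coefficients, purely illustrative). Each variable is copied
# into every cluster that contains it.
np.random.seed(0)
clusters = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]   # overlapping pairs
T_p = [np.random.rand(2, 2) for _ in clusters]        # 2^{n_p} entries each

def T_eps(x):
    val = 1.0
    for (i, j), coeffs in zip(clusters, T_p):
        val *= coeffs[x[i], x[j]]
    return val

print(T_eps([0, 1, 1, 0, 1, 0]))
```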

Another example are String-Bond States (SBS) [37, 38], defined by placing Matrix Product States over strings (each string s is an ordered subset of the set of variables) on a graph which need not be a one-dimensional lattice. The resulting tensor network is

T_{x_1,\dots,x_N} = \prod_{s} \mathrm{Tr}\!\left( \prod_{j \in s} A^{x_j}_{s,j} \right) \qquad (12)

The value of each visible variable is copied and sent to the different MPS. A RBM is a special case of SBS [31] for which each string is associated with a hidden variable and covers all visible variables, and the matrices are taken to be

A^{x_j}_{s,j} = \begin{pmatrix} 1 & 0 \\ 0 & e^{w_{sj} x_j} \end{pmatrix} \qquad (13)

SBS thus provide a generalization of RBM that is naturally defined for discrete variables of any dimension and can introduce different correlations through the use of higher dimensional and non-commuting matrices. Since SBS also include a MPS as a particular case, they provide a way to interpolate between a MPS (large bond dimension, only one string) and a RBM (bond dimension 2, diagonal matrices, many strings). Different choices of string geometries can be used and the geometry may be defined depending on the problem. For example in 2 dimensions (which is more suitable for images), one may place only short horizontal and vertical strings covering the 2D lattice (Fig. 5a). We will denote this kind of SBS as 2D-SBS. Correlations along one of the two dimensions can be captured in the corresponding MPS, and more complex correlations are included through the overlap of the strings. A toy evaluation of Eq. (12), including the RBM special case of Eq. (13), is sketched below.
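```python
import numpy as np

# Toy evaluation of Eq. (12): an SBS weight as a product of MPS traces
# over strings (hypothetical strings, random tensors), plus a check of
# the RBM special case of Eq. (13); bias terms are ignored.
np.random.seed(0)
N, D = 6, 3
strings = [[0, 1, 2, 3], [2, 3, 4, 5], [0, 5]]  # ordered variable subsets
A = {(s, j): np.random.randn(2, D, D)
     for s, string in enumerate(strings) for j in string}

def T_sbs(x):
    val = 1.0
    for s, string in enumerate(strings):
        M = np.eye(D)
        for j in string:             # x_j is copied into every string with j
            M = M @ A[(s, j)][x[j]]
        val *= np.trace(M)
    return val

print(T_sbs([0, 1, 1, 0, 1, 0]))

# RBM special case: each string covers all variables with the diagonal
# matrices of Eq. (13); each trace gives 1 + exp(sum_j w_sj x_j).
w = np.random.randn(3, N)
def T_rbm(x):
    val = 1.0
    for s in range(3):
        M = np.eye(2)
        for j in range(N):
            M = M @ np.diag([1.0, np.exp(w[s, j] * x[j])])
        val *= np.trace(M)
    return val

x = np.array([1, 0, 1, 1, 0, 1])
expected = np.prod([1 + np.exp(w[s] @ x) for s in range(3)])
print(np.isclose(T_rbm(x), expected))
```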

4

FIG. 3. Copy operation of a vector input Ai, resulting in anew tensor Bij = AiAj , or of a tensor network.

(a) RBM (b) SBS

(c) Short-range RBM (d) EPS

FIG. 4. (a) Restricted Boltzmann Machine (RBM) consistingof visible and hidden variables (b) String-Bond State with1D geometry generalizing RBM. The legs corresponding tocontracted indices in each MPS are depicted in orange forvisibility. (c) Short-range RBM with local connections be-tween visible and hidden variables (d) Entangled PlaquetteState (EPS) generalizing the short-range RBM.

(Fig. 3). More generally, one can apply this copy operation to a larger tensor network, which is then copied. In practice one would first contract the incoming network, resulting in a vector. This vector is then copied and sent to the remaining edges. The rest of the tensor network can then be independently contracted. We impose that there are no directed loops containing dots, which ensures that the tensor network can be efficiently contracted as long as it is contracted in the right order.

In general, it is not possible to write this dot as a tensor [43]. In particular cases, when all the inputs are discrete and in a fixed basis, it is possible to write the copy operation as a COPY-dot tensor, as introduced in Ref. [53]. For discrete inputs, and when the copy operation only applies to the inputs, the generalized tensor network can therefore be written as a tensor network including COPY-dot tensors. In the more general case, while it is not possible to represent the copy operation as a tensor, duplicating a vector and sending it to two different parts of the calculation is easily achieved in practice. The generalized tensor network can in this case be viewed as a larger tensor network containing several copies of the same tensors (weight sharing) and with inputs which are copied several times.
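To make the copy operation concrete, here is a minimal numpy sketch (the tensors, shapes and names are our own illustration, not from the paper): the incoming network is contracted down to a vector first, and only then duplicated.

import numpy as np

# Copy operation of Fig. 3: a vector A is duplicated, giving B_ij = A_i A_j.
A = np.array([0.6, 0.8])
B = np.outer(A, A)                 # B[i, j] = A[i] * A[j]

# Copying a larger tensor network: contract it down to a vector first, ...
T = np.random.rand(3, 2)           # an incoming tensor standing in for a network
x = np.array([1.0, -1.0])          # its input
v = T @ x                          # ... then duplicate the resulting vector
copies = (v.copy(), v.copy())      # and send the copies to the remaining edges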

Examples of such tensor networks with copies of the input states have been used in the quantum physics community. The simplest example is the Entangled Plaquette State (EPS) [40–42], also known as a Correlator Product State, in which the tensor network is defined as a product of smaller tensors on overlapping clusters of variables:

T_{x_1,...,x_N} = \prod_{p=1}^{P} T_p^{x_p} ,    (11)

where a coefficient T_p^{x_p} is assigned to each of the 2^{n_p} (for binary variables) configurations x_p of the variables in cluster p. Because the clusters overlap, the value of each variable is copied to all the tensors in which it is included. A sparse or short-range RBM is a special case of EPS [31] (possibly non-local, for a sparse but non-local RBM) in which the tensor T_p^{x_p} takes the particular form (1 + e^{\sum_j w_{pj} x_j}), where the sum is limited to the variables in each cluster. EPS also share some similarity with Convolutional Neural Networks (CNN): a convolutional layer with discrete inputs is a particular case of EPS with weight sharing between the tensors. The EPS realizes all possible convolutional filters over a discrete input space, instead of selecting a small number of filters as CNNs do.
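As a minimal sketch of Eq. (11) for binary variables (the clusters and tensor entries below are arbitrary placeholders of our choosing):

import numpy as np

# EPS / Correlator Product State: T_{x_1..x_N} = prod_p T_p^{x_p}, Eq. (11)
clusters = [(0, 1), (1, 2), (2, 3)]              # overlapping clusters of variables
Ts = [np.random.rand(2, 2) for _ in clusters]    # one coefficient per cluster configuration

def eps_amplitude(x):
    # product over clusters; overlapping variables are simply reused (copied)
    amp = 1.0
    for idxs, T in zip(clusters, Ts):
        amp *= T[tuple(x[i] for i in idxs)]
    return amp

print(eps_amplitude([0, 1, 1, 0]))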

Another example is the String-Bond State (SBS) [37, 38], defined by placing Matrix Product States over strings (each string s is an ordered subset of the set of variables) on a graph which need not be a one-dimensional lattice. The resulting tensor network is

T_{x_1,...,x_N} = \prod_s \operatorname{Tr} \left( \prod_{j \in s} A_{s,j}^{x_j} \right) .    (12)

The value of each visible variable is copied and sent to the different MPS. An RBM is a special case of SBS [31] for which each string is associated with a hidden variable and covers all visible variables, and the matrices are taken to be

A_{s,j}^{x_j} = \begin{pmatrix} 1 & 0 \\ 0 & e^{w_{sj} x_j} \end{pmatrix} .    (13)

SBS thus provide a generalization of RBM that is naturally defined for discrete variables of any dimension and can introduce different correlations through the use of higher-dimensional and non-commuting matrices. Since SBS also include an MPS as a particular case, they provide a way to interpolate between an MPS (large bond dimension, only one string) and an RBM (bond dimension 2, diagonal matrices, many strings). Different choices of string geometries can be used, and the geometry may be defined depending on the problem. For example, in two dimensions (which is more suitable for images), one may place only short horizontal and vertical strings covering the 2D lattice (Fig. 5a). We will denote this kind of SBS as 2D-SBS. Correlations along one of the two dimensions can be captured in the corresponding MPS, and more complex correlations are included through the overlap of the horizontal and vertical strings.
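The following sketch evaluates Eq. (12) with the diagonal matrices of Eq. (13), so that each string reproduces one RBM factor 1 + exp(sum_j w_sj x_j); the weights here are random placeholders of our own.

import numpy as np

N, S = 4, 3                        # visible variables, strings
w = np.random.randn(S, N)          # RBM-like weights

def A(s, j, xj):
    # Eq. (13): with these diagonal matrices the SBS reduces to an RBM
    return np.diag([1.0, np.exp(w[s, j] * xj)])

def sbs_amplitude(x):
    # Eq. (12): product over strings of traces of matrix products;
    # here every string covers all visible variables, as for an RBM
    amp = 1.0
    for s in range(S):
        M = np.eye(2)
        for j in range(N):         # each visible value is copied to every string
            M = M @ A(s, j, x[j])
        amp *= np.trace(M)
    return amp

print(sbs_amplitude([0, 1, 1, 0]))

Replacing the diagonal 2 x 2 matrices with general, non-commuting D x D matrices turns this RBM special case into a genuine SBS.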

FIG. 9. (a) Real inputs X_i are mapped to a feature vector (here with length two). This vector can then be used as input to a generalized tensor network by contracting it with the open legs of the generalized tensor network. (b) Feature tensors can compress the discretized representation of the inputs X_i to a smaller dimensional space. These tensors can share weights and can be learned as part of the tensor network.

FIG. 10. (a) Dataset with two features X_1 and X_2 and two classes (depicted in different colors) that cannot be learned by an MPS of bond dimension 2 with the features in Eq. (26). (b) Two normalized features learned by a tensor while classifying the previous dataset with an MPS of bond dimension 2. The features have been discretized in 16 intervals. Using this choice of features the MPS can classify the dataset perfectly.

functions and learning them at the same time as the rest of the network. To be able to use a purely tensor network algorithm, we can parametrize these functions using a tensor network. In the simplest case, we discretize the real data and use a tensor to compress the large-dimensional input into a smaller-dimensional vector of suitable length. This tensor can be learned as part of the whole tensor network and prevents the size of the rest of the tensor network from increasing when the discretization size changes. The feature tensor can be the same for all variables, for example image pixels, but can be different when the variables are of a different nature. Using this procedure, an MPS of bond dimension 2 is able to get perfect accuracy on the dataset presented in Fig. 10a. The two features that the network has learned are presented in Fig. 10b. We note that starting from random features on more complex datasets makes learning difficult, but the feature tensor can be pretrained using a linear classifier before being trained with the rest of the network.
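A minimal sketch of this feature tensor (the 16 discretization intervals and output dimension 2 match the figures; the initialization and normalization details are our own assumptions):

import numpy as np

bins, d = 16, 2                    # discretization intervals, feature dimension
F = np.random.rand(bins, d)        # feature tensor, learned with the rest of the network

def feature(x):
    # map a real x in [0, 1] to a length-d feature vector through F
    k = min(int(x * bins), bins - 1)    # index of the discretization interval
    v = F[k]
    return v / np.linalg.norm(v)        # normalized feature, as in Fig. 10b

print(feature(0.37))

Since F holds only bins x d parameters, refining the discretization does not enlarge the rest of the tensor network.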

In comparison, we also show in Fig. 11b the features learned while classifying MNIST with greyscale pixels and a snake-SBS (see Section V). These features are not very different from the choice in Eq. (26) (Fig. 11a), and we could not distinguish performance with this choice or

FIG. 11. (a) Choice of two features in Eq. (26) for an input taking real values between 0 and 1. (b) Two normalized features learned by a tensor with output dimension 2 combined with a snake-SBS classifying the MNIST dataset. The input features x are the greyscale values of pixels, normalized between 0 and 1 and discretized in 16 intervals.

with learned features on this dataset. We expect, however, that this procedure will be necessary for more complex datasets which are not easily approximated by a binary function. Moreover, the size of the feature vector provides a regularization of the model, and larger sizes might be necessary for more complex datasets. More generally, this tensor could itself be represented with a small tensor network, to prevent the number of parameters from increasing too much for a very small discretization interval. It is interesting to note that the features learned in our examples are almost continuous even if we use smaller discretization intervals. This means that two real inputs that are close to each other will lead to the same predictions by the network, a property which is in general not true if we simply discretize the inputs and use a larger tensor network. Our approach of learning the features as part of the tensor network may be especially relevant in the context of quantum machine learning, where the tensor network is replaced by a quantum circuit and it might be suitable to have the full network as part of the same quantum machine learning architecture.

As an alternative way of choosing the features, we can combine the feature choice with other machine learning techniques. If the input data represents images, it is a natural choice to use Convolutional Neural Networks as feature extractors, since these have been highly successful for image classification. CNNs consist of convolutional layers, which use convolution kernels to transform an image into a set of filtered images, and pooling layers which downsize the images (Fig. 12). The different filters can be seen as different features of the corresponding pixel or region of the image and preserve locality. It is therefore natural to consider the vector of applied filters associated with each location in the image as a feature vector that can be used in conjunction with generalized tensor networks, as sketched below. The CNN and the tensor network can be trained together, since the derivatives of the tensor network can be used in the backpropagation algorithm which computes the gradient of the cost function.
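A forward-pass sketch of this combination (the filters, MPS tensors and all shapes are random placeholders of ours; no training loop is shown): a small filter bank turns an image into per-location feature vectors, which are then contracted with the open legs of an MPS.

import numpy as np
from scipy.signal import correlate2d

img = np.random.rand(8, 8)                      # toy greyscale image
filters = np.random.randn(3, 3, 3)              # three 3x3 convolution kernels

# one filtered image per kernel -> a length-3 feature vector per location
feats = np.stack([correlate2d(img, f, mode='valid') for f in filters], axis=-1)
feats = feats.reshape(-1, 3)                    # (locations, feature dimension)

# contract each feature vector with the open leg of a (here random) MPS tensor
mps = [np.random.rand(3, 4, 4) for _ in range(len(feats))]
env = np.eye(4)
for v, A in zip(feats, mps):
    env = env @ np.einsum('i,ijk->jk', v, A)    # absorb one site at a time
print(np.trace(env))                            # scalar output of the model

In a joint training setup, the gradient of this scalar with respect to both the kernels and the MPS tensors flows through one and the same backpropagation graph.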


Learning local features: Fashion MNIST

Page 98

Deep Learning and Quantum Entanglement... Yoav Levine, David Yakira, Nadav Cohen, Amnon Shashua, arXiv:1704.01552

• "ConvAC" deep neural net = tree tensor network • Tensor network rank as capacity of model • Experiment on "inductive bias" of model architecture

Tree Network as a Deep Neural Net

Inductive Bias Experiment


Figure 6: The components comprising a 'ConvAC-weights TN' Φ that describes the weights tensor A^y of a ConvAC are an undirected graph G(V, E) and a bond dimensions function c. The bond dimension is specified next to each edge e ∈ E, and is given by the function c(e). As shown in sec. 6.2, the bond dimension of the edges in each layer of this TN is equal to the number of channels in the corresponding layer in the ConvAC. The node set in the graph G(V, E) presented above decomposes to V = V^tn ∪ V^inputs, where V^tn (grey) are vertices which correspond to tensors in the ConvAC TN and V^inputs (blue) are degree-1 vertices which correspond to the N open edges in the ConvAC TN. The vertices in V^inputs are 'virtual'; they were added for completeness, so G can be viewed as a legal graph. The open edge emanating from the top-most tensor (marked by a dashed line) is omitted from the graph, as it does not affect our analysis below: no flow between any two input groups can pass through it.

graph describing its connectivity and a function that assigns a bond dimension to each edge. In this section we describe the TN we constructed in sec. 6.2 in graph-theoretic terminology.

The ability to represent a deep convolutional network (ConvAC) as a 'legal' graph is a key accomplishment that the Tensor Networks tool brings forth. Our main results rely on this graph-theoretic description and tie the expressiveness of a ConvAC to a minimal cut in the graph characterizing it, via the connection to quantum entanglement measures. This is in fact a utilization of the 'Quantum min-cut max-flow' concept presented by Cui et al. (2016). Essentially, the quantum max-flow between A and B is a measure of the ability of the TN to model correlations between A and B, and the quantum min-cut is a quantity that bounds this ability and can be directly inferred from the graph defining it, namely that of the corresponding TN.

As noted in sec. 6.2, the TN that describes the calculation of a ConvAC, i.e. a network that receives as inputs the M representation channels and outputs Y labels, is the result of an inner product of two tensors: A^y_{d_1...d_N}, holding the weights of the network, and A^{(rank 1)}_{d_1...d_N}(x_1, ..., x_N), which is a rank-1 tensor holding the 'input' of N vectors v^{(0,j)} ∈ R^M, j ∈ [N], composing the representation layer. In this section we focus on the TN that describes A^y_{d_1...d_N}, which is the upper block of fig. 5 and is also reproduced as a stand-alone TN in fig. 6, referred to as the 'ConvAC-weights TN'.


Figure 10: (color available online) Results of applying deep convolutional rectifier networks with max pooling to the global and local classification tasks. Two channel arrangement geometries were evaluated: 'wide-tip', which supports modeling correlations between far away regions, and 'wide-base', which puts focus on correlations between regions that are close to each other. Each channel arrangement geometry outperforms the other on the task which exhibits fitting correlations, demonstrating how prior knowledge regarding a task at hand may be used to tailor the inductive bias through appropriate channel arrangements. Furthermore, these results demonstrate that the theoretical conclusions that were provided in sec. 7 for a ConvAC extend to the common ConvNet architecture which involves ReLU activations and max pooling.

on both tasks decreases. This is attributed to the 'hardness' of the task relative to the amount of parameters in the network. For these larger numbers of parameters in the networks, a harder task is expected to exhibit a more noticeable difference in performance between the two architectures.

Overall, in these experiments we show a notable compliance between a typical length of the data and an appropriate network design. Our results can therefore be seen as a clear demonstration of how prior knowledge regarding a task at hand may be used to tailor the inductive bias of a deep convolutional network by designing the channel widths appropriately, subject to the amount of resources at hand. We have shown how phenomena that were indicated by the theoretical analysis presented in this paper in the context of ConvACs manifest themselves in the most prevalent and successful ConvNet architecture, which involves ReLU activations and max pooling.

9. Discussion

The construction of a deep ConvAC in terms of a Tensor Network is the main theoretical achievement of this paper. This method, which constructs the ConvAC by inner products between low-order tensors rather than by outer products as has been performed to date, enabled us to carry out a graph-theoretic analysis of a convolutional network and tie its expressiveness to a minimal cut in the graph characterizing it. Specifically, network architecture related results were drawn, connecting the num…


Figure 9: Samples of the randomly positioned MNIST digits to be classified in the global task (above) and the local task (below).

to size 32 × 32. In both tasks the digit was to be identified correctly with a label 0, ..., 9. See fig. 9 for a sample of images from each task.

We designed two different networks that tackle these two tasks, with a difference in the channel ordering scheme that is meant to emphasize the difference between the two tasks in accordance with the analysis above. In both networks, the first layer is a representation layer: a 3 × 3 (with stride 1) shared convolutional layer. Following it are 6 hidden layers, each with 1 × 1 shared convolution kernels followed by ReLU activations and 2 × 2 max pooling (with stride 2). Classification in both networks was performed through Y = 10 network outputs, with prediction following the strongest activation. The difference between the two networks is in the channel ordering: in the 'wide-base' network they are wider in the beginning and narrow down in the deeper layers, while in the 'wide-tip' network they follow the opposite trend. Specifically, we set a parameter r to determine each pair of such networks according to the following scheme:

• Wide-base: [10; 4r; 4r; 2r; 2r; r; r; 10]
• Wide-tip: [10; r; r; 2r; 2r; 4r; 4r; 10]

The channel numbers from left to right go from shallow to deep. The channel numbers were chosen to be gradually increased/decreased in iterations of two layers at a time as a trade-off: we wanted the network to be reasonably deep but not to have too many different channel numbers, in order to resemble conventional channel choices. The parameter count for both configurations is identical: 10·r + r·r + r·2r + 2r·2r + 2r·4r + 4r·4r + 4r·10 = 31r² + 50r. A result compliant with our theoretical expectation would be for the 'wide-base' network to outperform the 'wide-tip' network in the local classification task, and the converse to occur in the global classification task.
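A quick sanity check of this parameter count (our own snippet; it treats each layer as a 1 × 1 convolution, so parameters are products of adjacent channel numbers):

def param_count(channels):
    # sum of products of adjacent channel widths
    return sum(a * b for a, b in zip(channels, channels[1:]))

r = 8
wide_base = [10, 4*r, 4*r, 2*r, 2*r, r, r, 10]
wide_tip = [10, r, r, 2*r, 2*r, 4*r, 4*r, 10]
assert param_count(wide_base) == param_count(wide_tip) == 31*r**2 + 50*r
print(param_count(wide_base))      # 2384 for r = 8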

Fig. 10 shows the results of applying both the 'wide-base' and 'wide-tip' networks to the local and global tasks. Each task consisted of 60000 training images and 10000 test images, in correspondence with the MNIST database. Indeed, the 'wide-base' network significantly outperforms the 'wide-tip' network in the local classification task, whereas a clear opposite trend can be seen for the global classification task. This complies with our discussion above, according to which the 'wide-base' network should be able to support short correlation lengths in the input, whereas the 'wide-tip' network is predicted to put focus on longer correlation lengths. The low performance relative to the regular MNIST task is due to the randomization of positions; there is no such degree of freedom in the regular MNIST task. This is of no concern to us: only the relative performance between networks on each task is of interest, in order to examine how our claims extend to the common ConvNet architecture. The fact that the global task gets higher accuracies for all choices of r is unsurprising, as it is clearly an easier task. An additional thing to note is that as r grows, the accuracies on both tasks increase and, with them, the difference between the performances of the architectures


Page 99

Time Series Prediction: Guo, Jie, Lu, Poletti, PRE 98, 042114

• Product-state input processed by a tensor network model
• Output is an MPS, approximated as a rank-1 (product) tensor
• Like recurrent neural nets, but better results than LSTM!


FIG. 1: Training phase: a pair of vectors, x_i and y_i, is converted to a pair of matrix product states (MPS), X^{σ}_i and Y^{τ}_i, and used to train a matrix product operator (MPO) W^{τ}_{σ}. Prediction phase: an input vector x_k is converted to an MPS X^{σ}_k and then multiplied by the trained MPO W^{τ}_{σ}. The resulting MPS Y^{τ}_k is then converted to a vector y_k.

The paper is organized as follows: in Sec. II we present the main parts of the algorithm; in Sec. III we apply our algorithm to sequence-to-sequence prediction, both in discrete (cellular automata) and continuous (coupled maps) problems, and to a classification task. Last, in Sec. IV we draw our conclusions. In the Appendices we provide further technical details and explanations for the more interested readers.

II. METHOD

In the following section we describe how we implement the training (or learning) phase of our model and how to test the accuracy of its predictions (testing phase). The MPO algorithm will be able to generate a sequence of L numbers given another one of the same length. Stated differently, we consider a mapping from a vector space X, with vectors x_i, to another vector space Y, with vectors y_i, which is given by

y_i = f(x_i),    (1)

where f does not need to be linear. The MPO algorithm will try to provide an accurate approximation ỹ_i to the exact vector y_i.

We will focus on the case in which the vectors x_i and y_i have the same length L, with corresponding elements x_{i,l} and y_{i,l}. The algorithm will be explained in detail in the following; however, to help the reader, we summarize it in Fig. 1. In the training phase we take a set of M input and output sequences x_i and y_i, where 1 ≤ i ≤ M. Each sequence x_i and y_i is converted to a matrix product state (MPS), that is, a product of tensors. All these MPS are used to generate/train a mapping from MPS to MPS, which is known as a matrix product operator (MPO).

In contrast, in the testing/prediction phase (see the right portion of Fig. 1), we take an input sequence (x_{k,1}, x_{k,2}, ..., x_{k,L}) (different from those used in the training), convert it to an MPS, and multiply it by the MPO trained earlier; this results in an output MPS which is then converted to a sequence (y_{k,1}, y_{k,2}, ..., y_{k,L}). To compute the accuracy of the algorithm, we compare the predicted output sequence with the exact one.

A. Training

The algorithm aims to provide, given a sequence, another sequence from an exponentially large number of possible ones. The output will thus be a function of the possible output sequences, and it will be used to choose the output sequence. One key ingredient of the proposed algorithm is to describe inputs and outputs, or their probability distribution, as MPS [36–39]. For example, the probability of a given sequence of integer numbers σ = (σ_1, σ_2, ..., σ_L), i.e. P(σ), can be written as

P(\sigma) = \sum_{a_0,...,a_L} M^{\sigma_1}_{a_0,a_1} M^{\sigma_2}_{a_1,a_2} \cdots M^{\sigma_L}_{a_{L-1},a_L} ,    (2)

where the σ_l describe the local degrees of freedom, whose total number is d, and the a_l are auxiliary degrees of freedom needed to take into account correlations between different elements of the sequence. The larger the number of auxiliary local degrees of freedom a_l, also known as the "bond dimension" D, the more accurate is the probability distribution. A single sequence σ can be written as an MPS with D = 1, also known as a product state. Since we rewrite inputs and outputs as MPS, the natural object to train, which will then be able to map an input MPS to an output MPS, is an MPO W^{τ}_{σ}, where τ is a sequence of output integer numbers. The MPO can be parameterized by a product of 4-dimensional tensors

W^{\tau}_{\sigma} = \sum_{b_0,b_1,...,b_L} W^{\sigma_1,\tau_1}_{b_0,b_1} W^{\sigma_2,\tau_2}_{b_1,b_2} \cdots W^{\sigma_L,\tau_L}_{b_{L-1},b_L} .    (3)

Here, the indices b_l indicate an auxiliary dimension which we refer to as the MPO's bond dimension D_W. In the next sections we are going to describe how to convert a sequence (also of real numbers) to an MPS and then how to train an MPO.
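A numpy sketch of Eqs. (2) and (3) may help here (random tensors, boundary bonds of dimension 1; entirely our own illustration):

import numpy as np

L, d, D, DW = 6, 2, 3, 4
# MPS tensors M_l[sigma, a_{l-1}, a_l], as in Eq. (2)
mps = [np.random.rand(d, 1 if l == 0 else D, 1 if l == L - 1 else D)
       for l in range(L)]
# MPO tensors W_l[sigma, tau, b_{l-1}, b_l], as in Eq. (3)
mpo = [np.random.rand(d, d, 1 if l == 0 else DW, 1 if l == L - 1 else DW)
       for l in range(L)]

def prob(seq):
    # Eq. (2): multiply the MPS matrices selected by the sequence
    env = np.ones((1, 1))
    for l, s in enumerate(seq):
        env = env @ mps[l][s]
    return env[0, 0]

def apply_mpo(mps, mpo):
    # contract the sigma index of each site and merge the bond indices pairwise
    out = []
    for A, W in zip(mps, mpo):
        B = np.einsum('sab,stcd->tacbd', A, W)
        out.append(B.reshape(d, A.shape[1] * W.shape[2], A.shape[2] * W.shape[3]))
    return out

print(prob([0, 1, 1, 0, 1, 0]), len(apply_mpo(mps, mpo)))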

1. From sequence to MPS

The first necessary step to train an MPO is to convert a sequence (input or output) to an MPS. The input and output sequences may belong to different spaces; for example, an input sequence could be made of words, letters, real numbers or integers. Each of the above can be represented as a vector of finite size d for the input, and d′ for the output.
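For a sequence of integers, the conversion to a product state (an MPS with D = 1) can be as simple as the following sketch (the one-hot encoding is a natural choice of ours; real-valued entries would instead be mapped to a finite feature vector of size d):

import numpy as np

def sequence_to_mps(seq, d):
    # encode a sequence as a product state: one (d, 1, 1) tensor per element
    return [np.eye(d)[s].reshape(d, 1, 1) for s in seq]

mps = sequence_to_mps([0, 1, 1, 0], d=2)
print(len(mps), mps[0].shape)      # 4 sites, each of shape (2, 1, 1)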


FIG. 5: (a) Average position ⟨l⟩, (b) standard deviation σ_l and (c) total probability P_T versus time for D_W = 5 (blue continuous line) and D_W = 20 (red dashed line). The exact solution is represented by the black dotted line. Other parameters are N = 100, M = 20000, k_max = 20, ε_t = 10^{-5} and α = 0.001.

line) to D_W = 10 (dashed red line) and D_W = 20 (dot-dashed yellow line), while the size of the training set is kept at M = 20000. In Fig. 6(b), instead, we keep the same bond dimension D_W = 20 and we vary the training data set size between M = 5000 (continuous blue line), M = 10000 (dashed red line) and M = 20000 (dot-dashed yellow line). We observe that the error decreases when both D_W and M increase, and ε can be as low as 5/1000.

1. Comparison to bidirectional LSTM neural networks

Long short-term memory cells are very useful building blocks for recurrent neural networks [49, 50]. Thanks to their ability to remember and forget information passed to them, they make it possible to capture relevant long-range correlations (see Appendix D for a more detailed description of this method). They are thus used for natural language processing, classification, time series prediction and more. Here we consider a bidirectional network [51], in which information is passed from left to right and from right to left, with LSTM cells. Such a single-layer network, despite using as many parameters as the MPO algorithm, is not able to give accurate results for the evolution. This is shown in Fig. 6(c), where the error ε for the single-layer network (blue continuous line) is plotted against time. We then tried a double-layer network (which has about twice as many parameters) [52, 53], and the error was significantly decreased, red dotted line in

FIG. 6: Average error ε between exact and predicted evolution, computed using Eq. (14), versus time for different MPO bond dimensions D_W (a), or for different numbers of training data M (b). In (a) M = 20000 while D_W = 5, 10 or 20 respectively for the blue continuous, red dashed and yellow dotted lines. In (b) D_W = 20 while M = 5000, 10000 or 20000 respectively for the blue continuous, red dashed and yellow dotted lines. Other parameters are N = 100, k_max = 20, ε_t = 10^{-5} and α = 0.001. (c) Comparison with bidirectional LSTM neural networks. The blue continuous curve is the error for a single-layer bidirectional network, while the red dashed curve is for a double-layer bidirectional network. The yellow dotted line is the error for the MPO algorithm with D_W = 20. All three algorithms have been trained with M = 20000 samples and N = 100.

Fig. 6(c), but still larger than the one obtained with the MPO algorithm. This shows that, for this task, the MPO algorithm is more accurate than our implementation of recurrent neural networks using LSTM cells.

C. Classification

It has been previously shown that algorithms based on MPS can be useful for classification tasks [26, 28, 29]. Here we show how the MPO model can also be used for such a purpose. In order to explain how to do so effectively, we consider a well-known example, that of categorizing pictures of handwritten digits from the MNIST data set [54]. In this problem, handwritten digits from 0 to 9 are converted into grayscale pictures with 28 × 28 pixels, and the data set is composed of 60000 training images and 10000 testing images, each associated with one of the ten possible digits. In order to treat this problem we convert each grayscale figure to a sequence of

Time series prediction errors: bidirectional-LSTM vs. MPO

Page 100

Schematic Learning Process

Marco Trenti, Lorenzo Sestini, Alessio Gianelle, Davide Zuliani, Timo Felser, Donatella Lucchesi, Simone Montangero, arXiv:2004.13747

Quantum-Inspired Machine Learning on High-Energy Physics Data

• Used a tree tensor network
• Determine the charge of a quark from collision events
• Compared to neural networks and gave excellent results

Page 101

Anomaly Detection with Tensor Networks

• Used a squared tensor network (locally purified state)
• Performs an anomaly detection task
• Strongly competes with, and for tabular data exceeds the performance of, state-of-the-art neural net approaches

Jinhui Wang, Chase Roberts, Guifre Vidal, Stefan Leichenauer, arXiv:2006.02516

Geometric Anomaly Detection

Model Architecture

Results on Tabular Data

Page 102

Quantum process tomography with unsupervised learning and tensor networks

• Also used a squared tensor network (locally purified state)
• Deduce the noisy circuit (actually a CPTP map) that explains observed data
• First scalable approach to quantum process tomography, as far as I know

Giacomo Torlai, Christopher J. Wood, Atithi Acharya, Giuseppe Carleo, Juan Carrasquilla, Leandro Aolita, arXiv:2006.02424

Model Architecture

Learning Random Quantum Circuits

Page 103

Summary & Future Directions

Tensor networks are a natural way to parameterize interesting and powerful machine learning models

• Linear scaling

• Adaptive weights

• Learning data "features"

Benefits:
• Interpretability & theory

• Better algorithms

• Quantum computing

Much work to be done:
• best approach for various data set types
• theory of learning of tensor networks (it is possible!)
• developing new learning algorithms

Page 104

Many Other Works (including Tensor Train Literature)

Yuan, Longhao, Qibin Zhao, and Jianting Cao. "Completion of high order tensor data with missing entries via tensor-train decomposition." International Conference on Neural Information Processing. Springer, Cham, 2017.

Yang, Yinchong, Denis Krompass, and Volker Tresp. "Tensor-train recurrent neural networks for video classification." Proceedings of the 34th International Conference on Machine Learning, Volume 70. JMLR.org, 2017.

Rose Yu, Stephan Zheng, Anima Anandkumar, Yisong Yue, "Long-term Forecasting using Tensor-Train RNNs", arXiv:1711.00073

Wang, Wenqi, Vaneet Aggarwal, and Shuchin Aeron. "Tensor completion by alternating minimization under the tensor train (TT) model."  arXiv:1609.05587 (2016).

Yuan, Longhao, Qibin Zhao, and Jianting Cao. "High-order tensor completion for data recovery via sparse tensor-train optimization." 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2018

Liu, Ding, et al. "Machine learning by two-dimensional hierarchical tensor networks: A quantum information theoretic perspective on deep architectures." arXiv preprint arXiv:1710.04833 (2017).

Hallam, Andrew, et al. "Compact neural networks based on the multiscale entanglement renormalization ansatz." arXiv preprint arXiv:1711.03357 (2017)

Pestun, Vasily, John Terilla, and Yiannis Vlassopoulos. "Language as a matrix product state." arXiv preprint arXiv:1711.01416 (2017)

Novikov, A., M. Trofimov, and I. Oseledets. "Exponential machines." Bulletin of the Polish Academy of Sciences. Technical Sciences 66.6 (2018)

Page 105

Some Learning Resources – tensornetwork.org