Training Quantum Boltzmann Machines


Page 1:
Page 2:

Nathan Wiebe, work with Maria Kieferova

Quantum Neural Networks

Page 3:

Machine learning

Page 4:

• Fast scalable quantum computers will fundamentally change machine learning.

• Quadratic (or greater) speedups.

• Better models for data.

• Improved privacy and security.

Quantum Machine Learning


Page 5:

Quantum Machine Learning

Quantum Boltzmann Machine Training.


Page 6:

Boltzmann Machines

A class of recurrent neural networks strongly related to physics.

๐‘ƒ ๐‘ฅ โˆ ๐‘’โˆ’๐ธ(๐‘ฅ)

[Figure: training examples labeled 1 = cat, 0 = not cat, placed along an energy axis.]

๐ธ ๐‘ฅ =

<๐‘–,๐‘—>

๐‘ค๐‘–,๐‘—๐‘›๐‘–๐‘›๐‘—

๐ธ ๐‘ฅ โ†’ ๐ป =

<๐‘–,๐‘—>

๐‘ค๐‘–,๐‘— 1 โˆ’ ๐‘๐‘– 1 โˆ’ ๐‘๐‘—4

๐‘ค = ๐‘Ž๐‘Ÿ๐‘”๐‘š๐‘–๐‘›๐‘ค(๐ท๐พ๐ฟ(๐‘ƒ ๐‘ฅ |๐‘ƒ๐‘ก๐‘Ÿ๐‘Ž๐‘–๐‘›(๐‘ฅ)))

Page 7:

Quantizing the Boltzmann Machine

$\rho \propto e^{-H}$

$H = \sum_j w_j H_j$

$w = \arg\min_w S\!\left(\rho_{\text{train}}\,\|\,\rho\right)$

Quantum Boltzmann machines learn directly from quantum states.

They also generate quantum states that closely resemble the training data.

๐ป๐‘—Do not commute

[Figure: classical vs. quantum energy assignments to configurations.]

Classically we have only energy penalties.

Quantumly, we control both the energies and the eigenbasis that the energies are applied in.
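A minimal numerical sketch of the quantization step (the specific Pauli terms and weight values below are assumptions chosen for illustration, not the slide's model): build $H = \sum_j w_j H_j$ from non-commuting terms and form the Gibbs state $\rho \propto e^{-H}$ by exact matrix exponentiation.

```python
import numpy as np
from scipy.linalg import expm

# Single-qubit Paulis and identity.
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Two-qubit example: H_j terms that do not commute (Z⊗Z, X⊗I, I⊗X).
H_terms = [kron_all([Z, Z]), kron_all([X, I]), kron_all([I, X])]
w = np.array([0.8, 0.5, 0.3])          # weights w_j (illustrative values)

H = sum(wj * Hj for wj, Hj in zip(w, H_terms))
rho = expm(-H)
rho /= np.trace(rho).real               # Gibbs state rho = exp(-H) / Tr(exp(-H))

print("Tr(rho) =", np.trace(rho).real)            # should be 1
print("eigenvalues:", np.linalg.eigvalsh(rho))    # a valid probability spectrum
```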

Page 8:

Training quantum Boltzmann Machines

We train using gradient descent on the quantum relative entropy (quantum KL divergence).

๐œŽ =๐‘’โˆ’ ฯƒ๐‘—๐‘ค๐‘—๐ป๐‘—

๐‘‡๐‘Ÿ(๐‘’โˆ’ฯƒ๐‘—๐‘ค๐‘—๐ป๐‘—)

log ๐œŽ = โˆ’ฯƒ๐‘—๐‘ค๐‘—๐ป๐‘— โˆ’ log ๐‘‡๐‘Ÿ(๐‘’โˆ’ ฯƒ๐‘—๐‘ค๐‘—๐ป๐‘—

๐œ•๐‘ค๐‘—log ๐œŽ = โˆ’๐ป๐‘— + ๐‘‡๐‘Ÿ ๐œŽ๐ป๐‘—

๐œ•๐‘ค๐‘—๐‘† ๐œŒ ๐œŽ = โˆ’Tr ๐œŒ๐ป๐‘— + ๐‘‡๐‘Ÿ ๐œŽ๐ป๐‘— = ๐ป๐‘— ๐‘š๐‘œ๐‘‘๐‘’๐‘™

โˆ’ ๐ป๐‘— ๐‘‘๐‘Ž๐‘ก๐‘Ž

The number of times the training data is accessed and Gibbs states are prepared scales as $O(M^2/\epsilon^2)$, where $M$ is the number of weights.
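The gradient formula above can be checked numerically for a few qubits. The sketch below is an exact-diagonalization illustration under assumed Pauli terms and weights, not the quantum training algorithm whose $O(M^2/\epsilon^2)$ cost is quoted: it computes $\operatorname{Tr}(\rho H_j) - \operatorname{Tr}(\sigma H_j)$ directly and runs gradient descent on the weights.

```python
import numpy as np
from scipy.linalg import expm, logm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I = np.eye(2, dtype=complex)

def kron_all(ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def gibbs(w, H_terms):
    """sigma = exp(-sum_j w_j H_j) / Tr(exp(-sum_j w_j H_j))."""
    H = sum(wj * Hj for wj, Hj in zip(w, H_terms))
    sigma = expm(-H)
    return sigma / np.trace(sigma).real

# Assumed model terms; the "training" state is a Gibbs state with hidden weights.
H_terms = [kron_all([Z, Z]), kron_all([X, I]), kron_all([I, X])]
w_true = np.array([1.0, 0.4, -0.6])
rho = gibbs(w_true, H_terms)             # plays the role of the training data

w = np.zeros(3)                          # model weights, initialized at zero
eta = 0.2                                # learning rate
for step in range(500):
    sigma = gibbs(w, H_terms)
    # d/dw_j S(rho || sigma) = Tr(rho H_j) - Tr(sigma H_j)
    grad = np.array([np.trace(rho @ Hj).real - np.trace(sigma @ Hj).real
                     for Hj in H_terms])
    w -= eta * grad                      # gradient descent step

sigma = gibbs(w, H_terms)
rel_ent = np.trace(rho @ (logm(rho) - logm(sigma))).real
print("learned weights:", np.round(w, 3))
print("relative entropy:", rel_ent)
```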

Page 9:

Relative entropy training

[Figure: training objective function.]

Page 10:

Learning a model for a quantum system

Assume a quantum Hamiltonian that is a transverse Ising model.

Find $\alpha$, $\beta$, and $\gamma$ that best represent the Hamiltonian given training data.

$\rho = e^{-\widetilde{H}}/Z$

Want to minimize $\Delta H = \|H - \widetilde{H}\|$.
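As an illustration, the sketch below builds a small transverse Ising Hamiltonian and evaluates $\Delta H$ as a spectral norm. The assignment of $\alpha$, $\beta$, $\gamma$ to coupling, transverse-field, and bias terms, and all numerical values, are assumptions made for this example.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I = np.eye(2, dtype=complex)

def op_on(site_ops, n):
    """Tensor product with the given single-qubit ops at chosen sites, identity elsewhere."""
    out = np.array([[1.0 + 0j]])
    for k in range(n):
        out = np.kron(out, site_ops.get(k, I))
    return out

def transverse_ising(alpha, beta, gamma, n):
    """Assumed form: H = sum_i alpha_i Z_i Z_{i+1} + sum_i beta_i X_i + sum_i gamma_i Z_i."""
    H = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n - 1):
        H += alpha[i] * op_on({i: Z, i + 1: Z}, n)
    for i in range(n):
        H += beta[i] * op_on({i: X}, n) + gamma[i] * op_on({i: Z}, n)
    return H

n = 3
H_true = transverse_ising([1.0, 0.9], [0.5] * n, [0.1] * n, n)
H_hat  = transverse_ising([1.1, 0.8], [0.45] * n, [0.15] * n, n)  # a learned estimate

# Delta(H) = ||H - H_tilde||, here taken as the spectral norm.
delta = np.linalg.norm(H_true - H_hat, ord=2)
print("Delta H =", round(delta, 4))
```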

Page 11:

Building a quantum associative memory

Quantum training vectors: $v_j = |\psi_j\rangle \otimes |e_j\rangle$

[Figure: data qubits and address qubits, each register initialized to $|0\rangle$.]

Pick ๐ป๐‘— from a universal set of Hamiltonians on 6 qubits. 1000 epochs, learning rate=0.2

Relative entropy $\approx 10^{-3}$.
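A sketch of how the associative-memory training data might be assembled: tag each data state $|\psi_j\rangle$ with an address basis state $|e_j\rangle$. Treating the training state as the uniform mixture of the tagged vectors is an assumption, and the register sizes are scaled down from the slides' 6 qubits.

```python
import numpy as np

def basis_state(index, dim):
    """Computational-basis vector |index> in a Hilbert space of dimension dim."""
    e = np.zeros(dim, dtype=complex)
    e[index] = 1.0
    return e

# Illustrative setup: 2 data qubits and 2 address qubits.
n_data, n_addr = 2, 2
d_data, d_addr = 2**n_data, 2**n_addr

rng = np.random.default_rng(1)
patterns = []
for j in range(d_addr):
    psi = rng.normal(size=d_data) + 1j * rng.normal(size=d_data)
    patterns.append(psi / np.linalg.norm(psi))     # random normalized |psi_j>

# Training vectors v_j = |psi_j> ⊗ |e_j>: data register tagged by an address register.
vs = [np.kron(psi, basis_state(j, d_addr)) for j, psi in enumerate(patterns)]

# Assumed training state: the uniform mixture of the tagged training vectors.
rho_train = sum(np.outer(v, v.conj()) for v in vs) / len(vs)
print("Tr(rho_train) =", np.trace(rho_train).real)
print("rank:", np.linalg.matrix_rank(rho_train))   # one pure state per stored pattern
```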

Page 12:

Conclusions

Quantum computers enable new forms of machine learning.

Quantum neural networks can learn directly from quantum data.

This opens up much richer models of data that cannot be efficiently trained using classical computers.

Page 13:

What is machine learning? Machine learning uses computers to find patterns in data, classify data, or perform tasks.

Computers are not usually told explicitly how to do this, or what โ€œfeaturesโ€ to use.

[Figure: labeled training data (handwritten digits 4, 0, 5, …) and an unlabeled test digit that the model classifies as 4.]

Page 14:
Page 15:

ยฉ 2014 Microsoft Corporation. All rights reserved.