
Third Generation Machine Intelligence

Christopher M. Bishop
Microsoft Research, Cambridge

Microsoft Research Summer School 2009

First Generation

“Artificial Intelligence” (GOFAI)

Within a generation ... the problem of creating ‘artificial intelligence’ will largely be solved

Marvin Minsky (1967)

Expert Systems
– rules devised by humans

Combinatorial explosion

General theme: hand-crafted rules

Second Generation

Neural networks, support vector machines

Difficult to incorporate complex domain knowledge

General theme: black-box statistical models

Third Generation

General theme: deep integration of domain knowledge and statistical learning

Probabilistic graphical models
– Bayesian framework
– fast inference using local message-passing

Origins: Bayesian networks, decision theory, HMMs, Kalman filters, MRFs, mean field theory, ...

Bayesian Learning

Consistent use of probability to quantify uncertainty

Predictions involve marginalisation, e.g.
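The equation on this slide is lost in the transcript; a standard example of the kind of marginalisation meant here (a prediction for output y at input x, with the model parameters w integrated out against their posterior given data D) is:

```latex
p(y \mid x, \mathcal{D}) = \int p(y \mid x, \mathbf{w}) \, p(\mathbf{w} \mid \mathcal{D}) \, \mathrm{d}\mathbf{w}
```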

Why is prior knowledge important?

[Figure: data points in the (x, y) plane, with a query point x marked "?"]

Probabilistic Graphical Models

1. New insights into existing models

2. Framework for designing new models

3. Graph-based algorithms for calculation and computation (cf. Feynman diagrams in physics)

4. Efficient software implementation

Directed graphs to specify the model

Factor graphs for inference and learning

Probability theory + graphs

Directed Graphs

Example: Time Series Modelling

Manchester Asthma and Allergies Study

Chris Bishop, Iain Buchan, Markus Svensén, Vincent Tan, John Winn

Factor Graphs

From Directed Graph to Factor Graph

Local message-passing

Efficient inference by exploiting factorization:
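The factorization referred to here is presumably the usual factor-graph form; as a sketch, the joint distribution is a normalised product of local factors, and the sum-product messages exploit this structure:

```latex
p(\mathbf{x}) = \frac{1}{Z} \prod_{s} f_s(\mathbf{x}_s), \qquad
\mu_{f \to x}(x) = \sum_{\mathbf{x}_f \setminus x} f(\mathbf{x}_f)
  \prod_{y \in \mathrm{ne}(f) \setminus x} \mu_{y \to f}(y), \qquad
\mu_{x \to f}(x) = \prod_{g \in \mathrm{ne}(x) \setminus f} \mu_{g \to x}(x)
```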

Factor Trees: Separation

[Figure: factor tree over variables v, w, x, y, z with factors f1(v,w), f2(w,x), f3(x,y), f4(x,z)]

Messages: From Factors To Variables

[Figure: the same factor tree, showing messages sent from factors f2(w,x), f3(x,y), f4(x,z) to variable x]

Messages: From Variables To Factors

[Figure: the same factor tree, showing messages sent from variables y and z into factors f3(x,y) and f4(x,z), and from x into f2(w,x)]
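The two message types above can be made concrete with a small sketch. This is not the talk's code; it runs sum-product on the example factor tree f1(v,w), f2(w,x), f3(x,y), f4(x,z) with binary variables and randomly chosen factor tables, computing the marginal of x and checking it against brute-force enumeration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Random positive factor tables over binary variables (illustrative only)
f1 = rng.random((2, 2))  # f1(v, w)
f2 = rng.random((2, 2))  # f2(w, x)
f3 = rng.random((2, 2))  # f3(x, y)
f4 = rng.random((2, 2))  # f4(x, z)

# Leaf variables v, y, z send unit messages, so the factor-to-variable
# messages reduce to sums over the leaf index.
m_f1_w = f1.sum(axis=0)   # factor f1 -> variable w (sum over v)
m_f2_x = f2.T @ m_f1_w    # factor f2 -> variable x (sum over w, weighted by w's message)
m_f3_x = f3.sum(axis=1)   # factor f3 -> variable x (sum over y)
m_f4_x = f4.sum(axis=1)   # factor f4 -> variable x (sum over z)

# Marginal of x: product of all incoming messages, then normalise
p_x = m_f2_x * m_f3_x * m_f4_x
p_x /= p_x.sum()

# Brute-force check: sum the full joint over v, w, y, z
joint = np.einsum('vw,wx,xy,xz->x', f1, f2, f3, f4)
joint /= joint.sum()
assert np.allclose(p_x, joint)
print(p_x)
```

The point of the local scheme is that the message computations touch each factor once, instead of enumerating all joint assignments.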

What if marginalisations are not tractable?

[Figure: a true distribution alongside its Monte Carlo and Variational Bayes approximations]

– Monte Carlo
– Variational Bayes
– Loopy belief propagation
– Expectation propagation

Illustration: Bayesian Ranking

Ralf Herbrich, Tom Minka, Thore Graepel

Two Player Match Outcome Model

[Figure: factor graph for a two-player match: skills s1 and s2 generate performances for players 1 and 2, which determine the outcome y12]
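The two-player model on this slide can be written generatively as follows (a sketch in the standard TrueSkill notation; the prior parameters μ₀, σ₀ and the performance noise β are conventions, not taken from the slide):

```latex
s_i \sim \mathcal{N}(\mu_0, \sigma_0^2), \qquad
p_i \sim \mathcal{N}(s_i, \beta^2), \qquad
y_{12} = \operatorname{sign}(p_1 - p_2)
```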

Two Team Match Outcome Model

[Figure: factor graph for a two-team match: player skills s1…s4 combine into team performances t1 and t2, which determine the outcome y12]

Multiple Team Match Outcome Model

[Figure: factor graph for a multi-team match: player skills s1…s4, team performances t1, t2, t3, and pairwise outcomes y12 and y23]

Efficient Approximate Inference

[Figure: the same multi-team factor graph as on the previous slide]

Gaussian Prior Factors

Ranking Likelihood Factors
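Combining the Gaussian prior factors with the ranking likelihood factor gives a non-Gaussian posterior, which expectation propagation handles by moment-matching a truncated Gaussian. A simplified sketch of the resulting two-player update (winner vs. loser, no draws; parameter values are the conventional defaults, not taken from the slides):

```python
import math

def skill_update(mu_w, sigma_w, mu_l, sigma_l, beta):
    """Approximate skill update after player W beats player L.

    Moment-matches the truncated Gaussian produced by the
    'winner performance > loser performance' factor (EP-style).
    Illustrative sketch, not the production TrueSkill code.
    """
    c = math.sqrt(2 * beta**2 + sigma_w**2 + sigma_l**2)
    t = (mu_w - mu_l) / c
    pdf = math.exp(-t * t / 2) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(t / math.sqrt(2)))
    v = pdf / cdf      # additive correction to the means
    w = v * (v + t)    # multiplicative shrinkage of the variances
    mu_w_new = mu_w + (sigma_w**2 / c) * v
    mu_l_new = mu_l - (sigma_l**2 / c) * v
    sig_w_new = sigma_w * math.sqrt(max(1 - (sigma_w**2 / c**2) * w, 1e-12))
    sig_l_new = sigma_l * math.sqrt(max(1 - (sigma_l**2 / c**2) * w, 1e-12))
    return mu_w_new, sig_w_new, mu_l_new, sig_l_new

# Two new players with the conventional prior mu = 25, sigma = 25/3
mu1, s1, mu2, s2 = skill_update(25.0, 25/3, 25.0, 25/3, beta=25/6)
print(mu1, mu2)  # winner's mean rises, loser's falls; both variances shrink
```

Because the update both moves the means apart and shrinks the variances, uncertainty about a player's skill falls with every game played, which is what drives the fast convergence shown next.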

Convergence

[Figure: skill level (0–40) against number of games (0–400) for players "char" and "SQLWildman", estimated with Elo and with TrueSkill™]

TrueSkill™

John Winn, Chris Bishop

Infer.NET

research.microsoft.com/infernet

Tom Minka, John Winn, John Guiver, Anitha Kannan

Summary

New paradigm for machine intelligence built on:
– a Bayesian formulation
– probabilistic graphical models
– fast inference using local message-passing

Deep integration of domain knowledge and statistical learning

Large-scale application: TrueSkill™

Toolkit: Infer.NET

http://research.microsoft.com/~cmbishop
