
Being Sensitive to Uncertainty!

Leon Arriola¹ & James Hyman¹

¹Theoretical Division, T5–Applied Mathematics and Plasma Physics (previously T7–Mathematical Modeling & Analysis), Los Alamos National Laboratory

This work was carried out under the auspices of Los Alamos National Security, LLC (LANS), operator of the Los Alamos National Laboratory under Contract No. DE-AC52-06NA25396 with the U.S. Department of Energy.

MTBI: Summer 2010

Forward Problem (FP)

The forward problem (FP) takes nominal input parameters p and produces the associated output solution u.

[Diagram: input parameter p → forward problem → output solution u, or function(al) J(u)]

A u = b (linear system of equations), p ∈ {a_ij, b_i}

A u = λ u (eigenvalue problem), p ∈ {a_ij}

du/dt = f(u, t; p), u(0) = u_0 (initial value problem)

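As a concrete illustration (added here, not part of the slides), a minimal Python sketch that solves one instance of each forward problem; the matrices, right-hand side, and the decay model are made-up examples.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Linear system of equations: A u = b
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
u_lin = np.linalg.solve(A, b)

# Eigenvalue problem: A u = lambda u
eigvals, eigvecs = np.linalg.eig(A)

# Initial value problem: du/dt = f(u, t; p), u(0) = u0
def f(t, u, p):
    return -p * u                      # simple decay with parameter p

ivp = solve_ivp(f, (0.0, 5.0), [1.0], args=(0.5,))

print(u_lin, eigvals, ivp.y[0, -1])
```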

Forward Sensitivity Analysis (FSA)

Forward sensitivity analysis (FSA) introduces perturbations to the input parameters, via δp, and quantifies the subsequent perturbations to the output solution, via δu.

[Diagram: perturbation of parameter p + δp → forward sensitivity analysis → perturbation of output u + δu, or function(al) J(u + δu)]

A u = b ↦ (A + δA)(u + δu) = b + δb

A u = λ u ↦ (A + δA)(u + δu) = (λ + δλ)(u + δu)

du/dt = f(u, t; p) ↦ d[u + δu]/dt = f(u + δu, t; p + δp)


Forward Sensitivity Analysis (FSA)

If the solution u is differentiable in the parameters p:

A u = b ↦ A ∂u/∂p = ∂b/∂p − (∂A/∂p) u

A u = λ u ↦ A ∂u/∂p + (∂A/∂p) u = λ ∂u/∂p + (∂λ/∂p) u

du/dt = f(u, t; p) ↦ d/dt[∂u/∂p] = D_u[f] ∂u/∂p + ∂f/∂p

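A minimal sketch (added, not from the slides) of the linear-system case: the sensitivity ∂u/∂p for the illustrative choice p = a_11 is obtained by solving the forward sensitivity equation above and then checked against a finite difference.

```python
import numpy as np

# Made-up forward problem A u = b
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
u = np.linalg.solve(A, b)

# Forward sensitivity equation for p = a_11:
# A (du/dp) = db/dp - (dA/dp) u, with db/dp = 0 and dA/dp nonzero only in entry (1,1)
dA_dp = np.zeros_like(A)
dA_dp[0, 0] = 1.0
du_dp = np.linalg.solve(A, -dA_dp @ u)

# Finite-difference check of the same sensitivity
eps = 1e-6
A_pert = A.copy()
A_pert[0, 0] += eps
du_dp_fd = (np.linalg.solve(A_pert, b) - u) / eps

print(du_dp, du_dp_fd)   # the two results should agree to several digits
```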

Uncertainty Quantification (UQ)

Uncertainties in the input parameters enter the model and produce uncertainty in the output.

[Figure: probability distributions of the parameters p1 and p2]

The output isn't just a single value but rather a PDF as well.

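One common way to propagate such input uncertainty is plain Monte Carlo sampling; the model and the two input distributions below are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model output as a function of two uncertain parameters
def model(p1, p2):
    return p1 * np.exp(-p2)

# Sample the (assumed) input distributions of p1 and p2
p1 = rng.normal(loc=1.0, scale=0.1, size=10_000)
p2 = rng.uniform(low=0.4, high=0.6, size=10_000)

# Push the samples through the model: the output is itself a distribution
out = model(p1, p2)
print(out.mean(), out.std())
```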

Uncertainty Quantification

Combined distribution for parameters p1 and p2.

[Figure: joint distribution of p1 and p2]

Normalized Sensitivity Index

Define the normalized sensitivity indices (SI):

S_p := lim_{δp→0} (δu/u) / (δp/p) = (p/u) ∂u/∂p, u ≠ 0

If J(u) is a functional of u, the SI is:

S_p^J := lim_{δp→0} (δJ(u)/J(u)) / (δp/p) = (p/J(u)) ∂J(u)/∂p, J(u) ≠ 0

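For intuition (an added example, not from the slides): if u(p) = p², then S_p = (p/u) du/dp = 2 for every p ≠ 0, i.e. a 1% change in p produces roughly a 2% change in u. A few lines of Python confirm this with a finite difference.

```python
def u(p):
    return p ** 2

p, dp = 3.0, 1e-6
du_dp = (u(p + dp) - u(p)) / dp     # finite-difference approximation of du/dp
S_p = (p / u(p)) * du_dp            # normalized sensitivity index
print(S_p)                          # ~2.0, independent of the chosen p
```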

Linear System of Equations

Consider the linear system of equations

A u = b

Input parameters: p ∈ {a_ij, b_i}; output: u.

A u = b ↦ A ∂u/∂p = ∂b/∂p − (∂A/∂p) u

DDT!: A^{-1} A ∂u/∂p = A^{-1} ( ∂b/∂p − (∂A/∂p) u )


Linear System of Equations

A [N×N] · ∂u/∂p [N×1] = ∂b/∂p − (∂A/∂p) u

A ∂u/∂p [N×1] = ∂b/∂p − (∂A/∂p) u

DDT!: Something [M×N or 1×N] · A ∂u/∂p [N×1] = Something · ( ∂b/∂p − (∂A/∂p) u )


Linear System of Equations

DDT!: v^T [1×N] · A ∂u/∂p [N×1] = v^T ( ∂b/∂p − (∂A/∂p) u ) [1×1]

What's v??? Answer: I don't know yet!

Notice that v^T [1×N] · A [N×N] is a 1×N vector.

Let c^T := v^T A, in which case A^T v = c (Adjoint Problem)

What's c??? Answer: I don't know yet, but will shortly!


Linear System of Equations

So v^T A ∂u/∂p = v^T ( ∂b/∂p − (∂A/∂p) u ) becomes

c^T ∂u/∂p = v^T ( ∂b/∂p − (∂A/∂p) u )

Notice that

c^T ∂u/∂p = (c_1, c_2, ..., c_N) (∂u_1/∂p, ∂u_2/∂p, ..., ∂u_N/∂p)^T


Linear System of Equations

DDT!: c^T ∂u/∂p = (c_1, c_2, ..., c_N) (∂u_1/∂p, ∂u_2/∂p, ..., ∂u_N/∂p)^T

DDT!: c_k^T ∂u/∂p = (0, ..., 0, 1, 0, ..., 0) (∂u_1/∂p, ∂u_2/∂p, ..., ∂u_N/∂p)^T, with the 1 in the kth column, which picks out the single entry ∂u_k/∂p.


Linear System of Equations

In order to solve A ∂u/∂p = ∂b/∂p − (∂A/∂p) u,

premultiply both sides by v_k^T and define A^T v_k = c_k, where

c_k^T = (0, ..., 0, 1, 0, ..., 0) with the 1 in the kth column.

The final answer is:


Linear System of Equations

Forward Problem: A u = b

Adjoint Problem: A^T v_k = c_k

Forward Sensitivity: ∂u_k/∂p = v_k^T ( ∂b/∂p − (∂A/∂p) u )

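A minimal sketch (added, not part of the slides) of this forward/adjoint recipe for a made-up 3×3 system, taking p to be the first entry of b so that ∂A/∂p = 0; the last line is only a finite-difference check.

```python
import numpy as np

# Made-up forward problem A u = b
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
u = np.linalg.solve(A, b)                  # forward problem

# Adjoint problem for the k-th solution component: A^T v_k = c_k = e_k
k = 1
c_k = np.zeros(3)
c_k[k] = 1.0
v_k = np.linalg.solve(A.T, c_k)            # adjoint problem

# Forward sensitivity du_k/dp = v_k^T (db/dp - (dA/dp) u), p = first entry of b
db_dp = np.array([1.0, 0.0, 0.0])          # dA/dp = 0 for this parameter
du_k_dp = v_k @ db_dp

# Finite-difference check
eps = 1e-6
b_pert = b.copy()
b_pert[0] += eps
print(du_k_dp, (np.linalg.solve(A, b_pert)[k] - u[k]) / eps)
```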

Deterministic SIR Model

Consider a disease which, after some period of time, confers immunity or possibly death. Divide the population into one of three distinct states:

Susceptible: S
Infected/Infectious: I
Removed/Recovered: R


Deterministic SIR Model

Progression of an individual through these states can be schematically described by the directed graph¹

S → I → R

Commonly used deterministic SIR model:

dS/dt = −rSI
dI/dt = rSI − µI
dR/dt = µI

¹Stochastic models use MCMC/DAM

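A short sketch (added here, not in the slides) integrating this system with SciPy, using the parameter values and initial conditions quoted on the next slide; R(0) = 0 is an assumption.

```python
import numpy as np
from scipy.integrate import solve_ivp

# SIR right-hand side: r is the transmission rate, mu the removal rate
def sir(t, y, r, mu):
    S, I, R = y
    return [-r * S * I, r * S * I - mu * I, mu * I]

r, mu = 0.25, 0.0025               # values quoted on the next slide
y0 = [0.9, 0.1, 0.0]               # S0, I0 and an assumed R0 = 0

sol = solve_ivp(sir, (0.0, 70.0), y0, args=(r, mu), max_step=0.1)
print(sol.y[:, -1])                # S, I, R at t = 70
```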

Numerical Solution of SIR Model

Numerical solution where r = 0.25, µ = 0.0025, S0 = 0.9 and I0 = 0.1.

[Figure: S(t), I(t) and R(t) plotted for 0 ≤ t ≤ 70]

FSE of SIR Model

FSE wrt parameters r and µ:

d/dt[∂S/∂r] = −rI ∂S/∂r − rS ∂I/∂r − SI
d/dt[∂S/∂µ] = −rI ∂S/∂µ − rS ∂I/∂µ
d/dt[∂I/∂r] = rI ∂S/∂r + [rS − µ] ∂I/∂r + SI
d/dt[∂I/∂µ] = rI ∂S/∂µ + [rS − µ] ∂I/∂µ − I
d/dt[∂R/∂r] = µ ∂I/∂r
d/dt[∂R/∂µ] = µ ∂I/∂µ + I


FSE of SIR Model

What are the ICs?

For the parameter sensitivities the initial state does not depend on r or µ, so all of them start at zero, e.g. ∂I/∂r|_{t=0} = 0.

For a sensitivity with respect to an initial condition, e.g. ∂I/∂I0, the IC is 1 and all others are set to zero.

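The sketch below (added, not from the slides) appends the six sensitivity equations to the three state equations and integrates the augmented system, using zero initial sensitivities for the parameters as discussed above; the last line evaluates the normalized sensitivity indices of I at the final time.

```python
import numpy as np
from scipy.integrate import solve_ivp

# State vector y = [S, I, R, dS/dr, dI/dr, dR/dr, dS/dmu, dI/dmu, dR/dmu]
def sir_fse(t, y, r, mu):
    S, I, R, Sr, Ir, Rr, Sm, Im, Rm = y
    return [
        -r * S * I,                              # dS/dt
        r * S * I - mu * I,                      # dI/dt
        mu * I,                                  # dR/dt
        -r * I * Sr - r * S * Ir - S * I,        # d/dt[dS/dr]
        r * I * Sr + (r * S - mu) * Ir + S * I,  # d/dt[dI/dr]
        mu * Ir,                                 # d/dt[dR/dr]
        -r * I * Sm - r * S * Im,                # d/dt[dS/dmu]
        r * I * Sm + (r * S - mu) * Im - I,      # d/dt[dI/dmu]
        mu * Im + I,                             # d/dt[dR/dmu]
    ]

r, mu = 0.25, 0.0025
y0 = [0.9, 0.1, 0.0] + [0.0] * 6                 # parameter sensitivities start at 0
sol = solve_ivp(sir_fse, (0.0, 70.0), y0, args=(r, mu), max_step=0.1)

I, dI_dr, dI_dmu = sol.y[1], sol.y[4], sol.y[7]
print((r / I[-1]) * dI_dr[-1], (mu / I[-1]) * dI_dmu[-1])   # SI of I wrt r, mu at t = 70
```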

Numerical Solution of Sensitivity Indices

Time-dependent sensitivity index of I wrt r and µ.

[Figure: sensitivity indices of I with respect to r and µ for 0 ≤ t ≤ 70]

For t ≤ 30, I is most sensitive to changes in the parameter r, and almost unaffected for t > 30.


FSE of SIR Model

FSE wrt initial conditions:

d/dt[∂S/∂S0] = −rI ∂S/∂S0 − rS ∂I/∂S0
d/dt[∂S/∂I0] = −rI ∂S/∂I0 − rS ∂I/∂I0
d/dt[∂S/∂R0] = −rI ∂S/∂R0 − rS ∂I/∂R0
d/dt[∂I/∂S0] = rI ∂S/∂S0 + [rS − µ] ∂I/∂S0
d/dt[∂I/∂I0] = rI ∂S/∂I0 + [rS − µ] ∂I/∂I0
d/dt[∂I/∂R0] = rI ∂S/∂R0 + [rS − µ] ∂I/∂R0
d/dt[∂R/∂S0] = µ ∂I/∂S0
d/dt[∂R/∂I0] = µ ∂I/∂I0
d/dt[∂R/∂R0] = µ ∂I/∂R0

Proliferation of FSE's

In order to calculate the sensitivity indices, we first had to calculate the solutions to the system of three ODEs.

To do a full FSA, we must solve a total of 18 equations: the 3 state equations plus 3 × (2 parameters + 3 initial conditions) = 15 sensitivity equations.

In modeling the chemical kinetics of certain reactions, it would not be unreasonable to have 10 equations with 20 parameters.

A full FSA would then require solving a total of 10 + 10 × (20 + 10) = 310 ODEs.

This huge increase in the number of equations is a significant computational burden.


Forward Sensitivity Analysis (FSA)

[Figure: a single parameter p15 in the parameter/input space maps to many output sensitivities, e.g. ∂u4/∂p15, ∂u7/∂p15, ∂u19/∂p15, ∂u23/∂p15, ∂u36/∂p15, ∂u73/∂p15, in the solution/output space]

FSA is used when the number of output/solution variables of interest greatly exceeds the number of inputs/parameters.

Adjoint Sensitivity Analysis (ASA)

[Figure: many parameters p4, p8, p10, p26, p45, p93, p132 in the parameter/input space map to a single output sensitivity ∂u19/∂p_i in the solution/output space]

ASA is used when the number of parameters/inputs of interest greatly exceeds the number of outputs/solutions.

Adjoint Sensitivity of Functionals for ODEs/IVP

Forward problem:

du/dt = F[u(t; p)], u(0) = u0

u is an n×1 forward solution vector and p is a (k + n)×1 vector which represents any of the k parameters or n initial conditions associated with the problem.

FSE:

d/dt[D_p[u]] = D_u[F] · D_p[u] + D_p[F]

where the D's are Jacobians.


Adjoint Sensitivity of Functionals for ODEs/IVP

FSE:

d/dt[D_p[u]] = D_u[F] · D_p[u] + D_p[F]

Determine the sensitivity of an associated functional J(u) of the solution u, where g and h are given scalar functions:

J[u] := ∫_{t=0}^{b} g(u, p) dt + h(u, p)|_{t=b}

FSE for the functional J(u):

∇_p[J] = ∫_{t=0}^{b} ( D_p^T[u] · ∇_u[g] + ∇_p[g] ) dt + ( D_p^T[u] · ∇_u[h] + ∇_p[h] )|_{t=b}


Adjoint Sensitivity of Functionals for ODEs/IVP

∇_p[J] = ∫_{t=0}^{b} ( D_p^T[u] · ∇_u[g] + ∇_p[g] ) dt + ( D_p^T[u] · ∇_u[h] + ∇_p[h] )|_{t=b}

We wish to eliminate having to directly calculate D_p^T[u].

FSE:

d/dt[D_p[u]] − D_u[F] · D_p[u] − D_p[F] = 0

Define the standard inner product ⟨A, b⟩ := ∫_{t=0}^{b} b^T(t) · A(t) dt


Adjoint Sensitivity of Functionals for ODEs/IVP

Let v be an unspecified adjoint variable and take the inner product

⟨ d/dt[D_p[u]] − D_u[F] · D_p[u] − D_p[F], v ⟩ = ⟨0, v⟩ = 0

∫_{t=0}^{b} v^T ( d/dt[D_p[u]] − D_u[F] · D_p[u] − D_p[F] ) dt = 0

Derivative shift of v^T · d/dt[D_p[u]] using integration by parts:

v^T D_p[u] |_{t=0}^{b} + ∫_{t=0}^{b} ( −dv^T/dt − v^T D_u[F] ) D_p[u] dt − ∫_{t=0}^{b} v^T D_p[F] dt = 0


Adjoint Sensitivity of Functionals for ODEs/IVP

FS of the functional J and the inner product condition:

∇_p[J] = ∫_{t=0}^{b} ( D_p^T[u] · ∇_u[g] + ∇_p[g] ) dt + ( D_p^T[u] · ∇_u[h] + ∇_p[h] )|_{t=b}

v^T D_p[u] |_{t=0}^{b} + ∫_{t=0}^{b} ( −dv^T/dt − v^T D_u[F] ) D_p[u] dt − ∫_{t=0}^{b} v^T D_p[F] dt = 0

Compare the term D_p^T[u] · ∇_u[g] in the first expression with the term ( −dv^T/dt − v^T D_u[F] ) D_p[u] in the second.

Adjoint Sensitivity of Functionals for ODEs/IVP

Take the transpose and compare terms:

( ( −dv^T/dt − v^T D_u[F] ) D_p[u] )^T = D_p^T[u] ( −dv/dt − D_u^T[F] v )  ⟷  D_p^T[u] · ∇_u[g]

Define the adjoint problem

dv/dt + D_u^T[F] v := −∇_u[g]


Adjoint Sensitivity of Functionals for ODEs/IVP

Take the transpose and substitute:

∇_p[J] = ∫_{t=0}^{b} ( D_p^T[F] v + ∇_p[g] ) dt − D_p^T[u] v |_{t=0}^{b} + ( D_p^T[u] ∇_u[h] + ∇_p[h] )|_{t=b}

with adjoint problem

dv/dt + D_u^T[F] v := −∇_u[g],

and forward problem

du/dt = F[u(t; p)], u(0) = u0

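As an entirely illustrative sketch (added, not part of the slides), the code below applies this machinery to the earlier SIR model with the functional J = ∫_0^b I dt, i.e. g = I, h = 0 and p = (r, µ); with the terminal condition v(b) = 0 and an initial state that does not depend on the parameters, the boundary terms vanish and ∇_p[J] = ∫_0^b D_p^T[F] v dt.

```python
import numpy as np
from scipy.integrate import solve_ivp

r, mu, b = 0.25, 0.0025, 70.0
y0 = [0.9, 0.1, 0.0]

def F(t, y):                          # forward SIR right-hand side
    S, I, R = y
    return [-r * S * I, r * S * I - mu * I, mu * I]

fwd = solve_ivp(F, (0.0, b), y0, dense_output=True, max_step=0.1)

def adj(t, v):                        # dv/dt = -D_u^T[F] v - grad_u[g], with g = I
    S, I, R = fwd.sol(t)
    DuF = np.array([[-r * I, -r * S,      0.0],
                    [ r * I,  r * S - mu, 0.0],
                    [ 0.0,    mu,         0.0]])
    return -DuF.T @ v - np.array([0.0, 1.0, 0.0])

bwd = solve_ivp(adj, (b, 0.0), [0.0, 0.0, 0.0],   # integrate backwards from v(b) = 0
                dense_output=True, max_step=0.1)

# grad_p[J] = int_0^b D_p^T[F] v dt  (columns of D_p[F] are d/dr and d/dmu)
ts = np.linspace(0.0, b, 2001)
vals = []
for t in ts:
    S, I, R = fwd.sol(t)
    DpF = np.array([[-S * I, 0.0], [S * I, -I], [0.0, I]])
    vals.append(DpF.T @ bwd.sol(t))
vals = np.array(vals)
dt = ts[1] - ts[0]
grad = dt * (0.5 * vals[0] + vals[1:-1].sum(axis=0) + 0.5 * vals[-1])   # trapezoid rule
print(grad)                           # [dJ/dr, dJ/dmu]
```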

Quadratic Programming Problem

In 1952, Harry Markowitz published a seminal paper titled "Portfolio Selection" which laid the foundation for what is now called modern portfolio theory.

It constructed the mathematical framework for the well-known and accepted observation that investors, although seeking a maximum return on their investments, also simultaneously want to minimize the associated risk.

The proper mixture of various investments can significantly reduce the overall volatility of the portfolio, while maintaining a "high" rate of return.

Quantitatively, the theory provides two solutions: a maximum amount of return for a given level of risk, or a minimum level of risk for a given amount of return.


Wheat Selection

Since cereal grains, such as wheat, provide a substantial portion of the caloric needs of humans worldwide, issues such as disease management and prevention are of the utmost importance.

Soil type, average rainfall, disease tolerance, etc., affect the yield, and hence the bottom line.

To further complicate the problem, agricultural researchers are attempting to produce perennial grain crops that will displace the annual crops that are currently planted.

The commonly used practices that reduce disease inoculum in annual crops, such as tillage, delayed planting, or crop rotation, are not applicable to perennial crops.

Farmers would need to plant blends of seeds from a mixture of cultivars (varieties).


Wheat Selection

In the jargon of modern portfolio theory, investment in securities, stocks, or bonds is replaced with the planting of multiple wheat cultivars.

The objective of maximizing the expected rate of return on the investments is replaced with maximizing the wheat yield.

Minimizing the financial risks is replaced by minimizing the variation in wheat yield due to "genotype-environment interaction," i.e., how each cultivar responds to the inevitable, unpredictable environmental conditions.

Risk is defined in terms of the standard deviation/variance of the return on the assets, and is in fact a quadratic functional.

Once quantitative values can be established for the average yield, as well as the variance and covariance of yields of each cultivar, an optimal portfolio is found by solving a Quadratic Programming Problem (QPP).


Quadratic Programming Problem

Definition (QPP). The QPP is defined as

Maximize J(u_1, ..., u_n) := c^T u − (1/2) u^T Q u, subject to

a_11 u_1 + a_12 u_2 + ··· + a_1n u_n ≤ b_1
⋮
a_m1 u_1 + a_m2 u_2 + ··· + a_mn u_n ≤ b_m

u_1, ..., u_n ≥ 0

with Q a symmetric, positive semi-definite matrix.

Quadratic Programming Problem

Maximize the quadratic objective function

J(u_1, ..., u_n) := c^T u − (1/2) u^T Q u

subject to the constraints

A u ≤ b

with nonnegativity conditions

u_1, ..., u_n ≥ 0

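A small, hypothetical two-variable instance (all numbers invented for illustration) solved with SciPy's general-purpose SLSQP routine; a dedicated QP solver would normally be preferred, this is only a sketch.

```python
import numpy as np
from scipy.optimize import minimize

# Invented data: c are expected yields/returns, Q a covariance matrix
c = np.array([1.0, 0.8])
Q = np.array([[0.20, 0.05],
              [0.05, 0.10]])
A = np.array([[1.0, 1.0]])            # single constraint u1 + u2 <= 1
b = np.array([1.0])

def neg_J(u):                         # minimize -J(u) to maximize J(u)
    return -(c @ u - 0.5 * u @ Q @ u)

cons = [{"type": "ineq", "fun": lambda u: b - A @ u}]   # means b - A u >= 0
res = minimize(neg_J, x0=[0.5, 0.5], method="SLSQP",
               bounds=[(0.0, None), (0.0, None)], constraints=cons)
print(res.x, -res.fun)                # optimal u and the attained objective J
```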

General Optimization Problem

Maximize/minimize a given objective function

J(u) = F(u_1, ..., u_n)

subject to the K equality and L inequality constraints

f_k(u) = 0 where k = 1, ..., K
g_l(u) ≤ 0 where l = 1, ..., L.

Define the modified Lagrangian function by forming a linear combination of the objective functional and the constraints as

L(u; µ, λ) := J(u) + Σ_{k=1}^{K} µ_k f_k(u) + Σ_{l=1}^{L} λ_l g_l(u),

where µ_k and λ_l are called the Lagrange multipliers. The Lagrange multipliers are in fact adjoint variables.


Karush/Kuhn/Tucker Theorem

Theorem (Karush/Kuhn/Tucker Theorem). An optimal solution is found by solving the associated equations

∂J(u*)/∂u_j + Σ_{k=1}^{K} µ_k ∂f_k(u*)/∂u_j + Σ_{l=1}^{L} λ_l ∂g_l(u*)/∂u_j = 0 for j = 1, ..., n

µ_k f_k(u*) = 0 for k = 1, ..., K

λ_l g_l(u*) = 0 for l = 1, ..., L

where u* is the optimal solution.

Quadratic Programming Problem

The inequality constraints are transformed into equality constraints by the introduction of slack variables.

Construct the extended Lagrange function

L := c^T u − (1/2) u^T Q u + v^T ( b − A u − ((s_1)², (s_2)², ..., (s_m)²)^T ).

The optimal solution occurs at a critical point of the Lagrange function:

∂L/∂u_j = 0, ∂L/∂s_i = 0, and ∂L/∂v_i = 0


Quadratic Programming Problem

These equations respectively reduce to the mixed nonhomogeneous adjoint problem:

A^T v = c − Q u,

the orthogonality conditions

v_i s_i = 0, for i = 1, ..., m,

and lastly to the forward problem

A u + ((s_1)², (s_2)², ..., (s_m)²)^T = b.


Quadratic Programming Problem

Let p denote any of the parameters a_ij, b_i, c_j, or q_ij, where q_ij denotes the i, j entry of the matrix Q. Differentiate the objective function wrt the parameter p:

∂J/∂p = (∂c^T/∂p) u − (1/2) u^T (∂Q/∂p) u + (1/2) ( 2 c^T ∂u/∂p − u^T Q ∂u/∂p − (∂u^T/∂p) Q u )

Since the matrix Q is symmetric,

( Q ∂u/∂p )^T = (∂u^T/∂p) Q,

in which case ∂J/∂p reduces to

∂J/∂p = (∂c^T/∂p) u − (1/2) u^T (∂Q/∂p) u + ( c^T − u^T Q ) ∂u/∂p,

where the factor ( c^T − u^T Q ) will be replaced with v^T A.


Quadratic Programming Problem

The expression ∂u/∂p will be replaced by an expression containing the forward and adjoint solutions. This expression is found by differentiating the forward problem A u + ((s_1)², (s_2)², ..., (s_m)²)^T = b to get

A ∂u/∂p + (∂A/∂p) u + 2 ( s_1 ∂s_1/∂p, s_2 ∂s_2/∂p, ..., s_m ∂s_m/∂p )^T = ∂b/∂p.

Next, premultiply this result by the adjoint solution v^T and use the orthogonality conditions v_i s_i = 0 to get

v^T A ∂u/∂p = v^T ( ∂b/∂p − (∂A/∂p) u ),

in which case

∂J/∂p = (∂c^T/∂p) u − (1/2) u^T (∂Q/∂p) u + v^T ( ∂b/∂p − (∂A/∂p) u )


Quadratic Programming Problem–Summary

Forward problem:

A u + ((s_1)², (s_2)², ..., (s_m)²)^T = b

Mixed nonhomogeneous adjoint problem:

A^T v = c − Q u

Derivative of the objective functional:

∂J/∂p = (∂c^T/∂p) u − (1/2) u^T (∂Q/∂p) u + v^T ( ∂b/∂p − (∂A/∂p) u )


Algorithmic Differentiation

Annuity Function

f(m, l, r, t) = l (r/m) (1 + r/m)^{m t} / ( (1 + r/m)^{m t} − 1 )

f returns the fixed periodic payment required to pay off a loan amount of l, made for m periodic payments per year, with annual interest rate r, and for a total of t years.

If a loan of $100,000 is taken over 20 years, with annual interest of 15%, then the monthly payment is given by

f(12, 100000, 0.15, 20) = 100000 (0.15/12) (1 + 0.15/12)^{12·20} / ( (1 + 0.15/12)^{12·20} − 1 ) = 1316.80.


Forward Evaluation Mode

Input variables are

p1 := m = 12.0
p2 := l = 100000.0
p3 := r = 0.15
p4 := t = 20.0

Intermediate variables

u1 := p3/p1 = 0.0125          (r/m)
u2 := 1 + u1 = 1.0125         (1 + r/m)
u3 := p1 · p4 = 240           (m · t)
u4 := u1 · u2^{u3} = 0.2464   ((r/m)(1 + r/m)^{m·t})
u5 := u2^{u3} − 1 = 18.715    ((1 + r/m)^{m·t} − 1)
u6 := u4/u5 = 0.013168
u7 := p2 · u6 = 1316.79       (Payment)

Output variable u = u7 = 1316.79

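The trace above transcribes directly into code. A minimal Python sketch (the function and variable names simply mirror the slide and are illustrative):

```python
# Forward evaluation trace of the annuity function f(m, l, r, t).
# The intermediate variables u1..u7 mirror the slide.

def annuity_trace(m=12.0, l=100000.0, r=0.15, t=20.0):
    u1 = r / m           # r/m
    u2 = 1.0 + u1        # 1 + r/m
    u3 = m * t           # m*t
    u4 = u1 * u2**u3     # (r/m)(1 + r/m)^(m*t)
    u5 = u2**u3 - 1.0    # (1 + r/m)^(m*t) - 1
    u6 = u4 / u5
    u7 = l * u6          # periodic payment
    return u7

print(annuity_trace())   # ~1316.79
```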

Forward Evaluation Mode

Deterministic algorithm can be represented in a graphical format.

[Directed graph of the evaluation trace: the input vertices p1, p2, p3, p4 feed the intermediate vertices u1, ..., u7, which feed the output vertex u.]

Abstract directed graph consists of two parts:
Vertices represent the "objects"
Directed edges represent the "relationships"


Forward Mode

How does the payment change with respect to changes in the interest rate?

[Directed graph of the evaluation trace, highlighting the paths from p3 to the output u.]

\[ \frac{du}{dp_3} = \overbrace{\frac{\partial u}{\partial u_7}\frac{\partial u_7}{\partial u_6}\frac{\partial u_6}{\partial u_4}\frac{\partial u_4}{\partial u_1}\frac{du_1}{dp_3}}^{p_3\to u_1\to u_4\to u_6\to u_7\to u} + \overbrace{\frac{\partial u}{\partial u_7}\frac{\partial u_7}{\partial u_6}\frac{\partial u_6}{\partial u_4}\frac{\partial u_4}{\partial u_2}\frac{\partial u_2}{\partial u_1}\frac{du_1}{dp_3}}^{p_3\to u_1\to u_2\to u_4\to u_6\to u_7\to u} \]

(The remaining path p_3 → u_1 → u_2 → u_5 → u_6 → u_7 → u contributes an analogous term through u_5.)

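The same trace can be differentiated in forward mode by propagating a tangent alongside each intermediate value. A Python sketch (names are illustrative; the finite-difference line is only a consistency check):

```python
# Forward-mode sensitivity of the payment with respect to the interest rate r:
# each intermediate u_k carries its tangent d_k = du_k/dr.

def annuity_forward_mode(m=12.0, l=100000.0, r=0.15, t=20.0):
    u1, d1 = r / m, 1.0 / m                # u1 = r/m
    u2, d2 = 1.0 + u1, d1                  # u2 = 1 + u1
    u3 = m * t                             # independent of r
    w, dw = u2**u3, u3 * u2**(u3 - 1.0) * d2
    u4, d4 = u1 * w, d1 * w + u1 * dw      # product rule
    u5, d5 = w - 1.0, dw
    u6 = u4 / u5
    d6 = (d4 * u5 - u4 * d5) / u5**2       # quotient rule
    return l * u6, l * d6                  # payment and its derivative

payment, dpay_dr = annuity_forward_mode()
h = 1e-6
fd = (annuity_forward_mode(r=0.15 + h)[0] -
      annuity_forward_mode(r=0.15 - h)[0]) / (2 * h)
print(payment, dpay_dr, fd)
```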

General Forward Mode

Due to precedence relations

u1 = u1(p)
u2 = u2(u1, p)
u3 = u3(u2, u1, p)
  ⋮
uN = uN(uN−1, uN−2, ..., u2, u1, p)


General Forward Sensitivity Mode

\[ \frac{du_1}{dp} = \frac{\partial u_1}{\partial p} \]

\[ \frac{du_2}{dp} = \frac{\partial u_2}{\partial u_1}\frac{du_1}{dp} + \frac{\partial u_2}{\partial p} \]

\[ \frac{du_3}{dp} = \frac{\partial u_3}{\partial u_1}\frac{du_1}{dp} + \frac{\partial u_3}{\partial u_2}\frac{du_2}{dp} + \frac{\partial u_3}{\partial p} \]

\[ \vdots \]

\[ \frac{du_N}{dp} = \frac{\partial u_N}{\partial u_1}\frac{du_1}{dp} + \frac{\partial u_N}{\partial u_2}\frac{du_2}{dp} + \cdots + \frac{\partial u_N}{\partial u_{N-1}}\frac{du_{N-1}}{dp} + \frac{\partial u_N}{\partial p} \]


General Forward Sensitivity Mode

This linear system can be written in the more concise form

\[ \left(D[\vec{u}] - 2I\right)\frac{d\vec{u}}{dp} = -\frac{\partial \vec{u}}{\partial p} \]

where

\[ D[\vec{u}] = \begin{pmatrix} 1 & 0 & \cdots & & 0 \\ \frac{\partial u_2}{\partial u_1} & 1 & 0 & \cdots & 0 \\ \frac{\partial u_3}{\partial u_1} & \frac{\partial u_3}{\partial u_2} & 1 & \cdots & 0 \\ \vdots & & & \ddots & \vdots \\ \frac{\partial u_N}{\partial u_1} & \frac{\partial u_N}{\partial u_2} & \cdots & \frac{\partial u_N}{\partial u_{N-1}} & 1 \end{pmatrix} \quad\text{and}\quad \vec{u} = \begin{pmatrix} u_1 \\ u_2 \\ \vdots \\ u_N \end{pmatrix} \]
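As a concrete illustration, a small numpy sketch that assembles D[u] for the annuity trace and solves (D[u] − 2I) du/dp = −∂u/∂p for p = r; the local partial derivatives are written out by hand and the variable names mirror the trace above:

```python
import numpy as np

m, l, r, t = 12.0, 100000.0, 0.15, 20.0
u1 = r / m; u2 = 1 + u1; u3 = m * t
w = u2**u3
u4 = u1 * w; u5 = w - 1; u6 = u4 / u5

N = 7
D = np.eye(N)                           # unit diagonal
D[1, 0] = 1.0                           # du2/du1
D[3, 0] = w                             # du4/du1
D[3, 1] = u1 * u3 * u2**(u3 - 1)        # du4/du2
D[3, 2] = u1 * w * np.log(u2)           # du4/du3
D[4, 1] = u3 * u2**(u3 - 1)             # du5/du2
D[4, 2] = w * np.log(u2)                # du5/du3
D[5, 3] = 1.0 / u5                      # du6/du4
D[5, 4] = -u4 / u5**2                   # du6/du5
D[6, 5] = l                             # du7/du6

du_explicit = np.zeros(N)
du_explicit[0] = 1.0 / m                # explicit partial du1/dr; all others are 0

du_dp = np.linalg.solve(D - 2 * np.eye(N), -du_explicit)
print(du_dp[-1])                        # du7/dr, the payment sensitivity
```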

Reverse Mode

[Directed graph of the evaluation trace, now traversed from the output u back toward the input vertices p1, ..., p4.]

Reverse Sensitivity Mode

[Directed graph restricted to the vertices p3, u1, u2, u4, u5, u6, u7, u.]

\[ \frac{\partial u}{\partial u_7} = 1 \]

\[ \frac{\partial u}{\partial u_6} = \frac{\partial u}{\partial u_7}\frac{\partial u_7}{\partial u_6} \qquad \frac{\partial u}{\partial u_5} = \frac{\partial u}{\partial u_6}\frac{\partial u_6}{\partial u_5} \qquad \frac{\partial u}{\partial u_4} = \frac{\partial u}{\partial u_6}\frac{\partial u_6}{\partial u_4} \]

\[ \frac{\partial u}{\partial u_2} = \frac{\partial u}{\partial u_4}\frac{\partial u_4}{\partial u_2} + \frac{\partial u}{\partial u_5}\frac{\partial u_5}{\partial u_2} \qquad \frac{\partial u}{\partial u_1} = \frac{\partial u}{\partial u_2}\frac{\partial u_2}{\partial u_1} + \frac{\partial u}{\partial u_4}\frac{\partial u_4}{\partial u_1} \]

\[ \frac{du}{dp_3} = \frac{\partial u}{\partial u_1}\frac{\partial u_1}{\partial p_3} \]
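The same sweep in Python for the annuity trace: one forward pass records the intermediate values, one backward pass accumulates the adjoints ∂u/∂u_k (names are illustrative):

```python
# Reverse-mode (adjoint) sweep: the payment sensitivity du7/dr from one
# backward pass over the recorded trace.

m, l, r, t = 12.0, 100000.0, 0.15, 20.0
u1 = r / m; u2 = 1 + u1; u3 = m * t           # forward sweep
w = u2**u3
u4 = u1 * w; u5 = w - 1; u6 = u4 / u5

ubar7 = 1.0                                   # seed: du/du7 = 1
ubar6 = ubar7 * l                             # u7 = l*u6
ubar5 = ubar6 * (-u4 / u5**2)                 # u6 = u4/u5
ubar4 = ubar6 * (1.0 / u5)
# u3 = m*t does not depend on r, so its adjoint is not needed here
ubar2 = ubar4 * u1 * u3 * u2**(u3 - 1) + ubar5 * u3 * u2**(u3 - 1)
ubar1 = ubar2 * 1.0 + ubar4 * w               # u2 = 1 + u1, u4 = u1*w
print(ubar1 * (1.0 / m))                      # du7/dr, since u1 = r/m
```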

General Reverse Sensitivity Mode

\[ \frac{\partial u}{\partial u_N} = \frac{\partial u_N}{\partial u_N} = 1 \]

\[ \frac{\partial u}{\partial u_{N-1}} = \frac{\partial u}{\partial u_N}\frac{\partial u_N}{\partial u_{N-1}} \]

\[ \frac{\partial u}{\partial u_{N-2}} = \frac{\partial u}{\partial u_{N-1}}\frac{\partial u_{N-1}}{\partial u_{N-2}} + \frac{\partial u}{\partial u_N}\frac{\partial u_N}{\partial u_{N-2}} \]

\[ \frac{\partial u}{\partial u_{N-3}} = \frac{\partial u}{\partial u_{N-2}}\frac{\partial u_{N-2}}{\partial u_{N-3}} + \frac{\partial u}{\partial u_{N-1}}\frac{\partial u_{N-1}}{\partial u_{N-3}} + \frac{\partial u}{\partial u_N}\frac{\partial u_N}{\partial u_{N-3}} \]

\[ \vdots \]

\[ \frac{\partial u}{\partial u_1} = \frac{\partial u}{\partial u_2}\frac{\partial u_2}{\partial u_1} + \frac{\partial u}{\partial u_3}\frac{\partial u_3}{\partial u_1} + \cdots + \frac{\partial u}{\partial u_N}\frac{\partial u_N}{\partial u_1} \]

\[ \frac{du}{dp} = \sum_{i=1}^{N}\frac{\partial u}{\partial u_i}\frac{\partial u_i}{\partial p} \]


Eigenvalue Problem

Consider the right eigenvalue problem

\[ A\vec{u} = \lambda\vec{u} \]

Assume that the eigenvalues λ_k, for k = 1, ..., n, are distinct. Hence we have n linearly independent eigenvectors \(\vec{u}_k\).

Input parameters: p ∈ {a_ij}. Outputs: λ_i, \(\vec{u}_i\).

FSEs

\[ A\frac{\partial \vec{u}}{\partial p} + \frac{\partial A}{\partial p}\vec{u} = \lambda\frac{\partial \vec{u}}{\partial p} + \frac{\partial \lambda}{\partial p}\vec{u} \]

This equation has two unknowns, ∂λ/∂a_ij and ∂\(\vec{u}\)/∂a_ij. Either obtain another independent equation or eliminate one of the unknown variables from this equation.


Eigenvalue Problem

Choosing the second strategy, let \(\vec{v}\) be some nonzero, as yet unspecified, vector and take the dot product

\[ \vec{v}^T A\frac{\partial \vec{u}}{\partial a_{ij}} + \vec{v}^T\frac{\partial A}{\partial a_{ij}}\vec{u} = \vec{v}^T\lambda\frac{\partial \vec{u}}{\partial a_{ij}} + \vec{v}^T\frac{\partial \lambda}{\partial a_{ij}}\vec{u} \]

Rearranging this equation and writing it using the inner product notation \(\langle\vec{a},\vec{b}\rangle = \vec{b}^T\vec{a}\), we find

\[ \frac{\partial \lambda}{\partial a_{ij}}\langle\vec{u},\vec{v}\rangle = \left\langle\frac{\partial A}{\partial a_{ij}}\vec{u},\vec{v}\right\rangle + \left\langle\left(A - \lambda I\right)\frac{\partial \vec{u}}{\partial a_{ij}},\vec{v}\right\rangle \]

Because \(\left(A - \lambda I\right)^T = A^T - \lambda I\), we can use the Lagrange identity for matrices under the usual inner product:

\[ \left\langle\left(A - \lambda I\right)\frac{\partial \vec{u}}{\partial a_{ij}},\vec{v}\right\rangle = \left\langle\frac{\partial \vec{u}}{\partial a_{ij}},\left(A^T - \lambda I\right)\vec{v}\right\rangle \]


Eigenvalue Problem

Compare

\[ \frac{\partial \lambda}{\partial a_{ij}}\langle\vec{u},\vec{v}\rangle = \left\langle\frac{\partial A}{\partial a_{ij}}\vec{u},\vec{v}\right\rangle + \left\langle\left(A - \lambda I\right)\frac{\partial \vec{u}}{\partial a_{ij}},\vec{v}\right\rangle \]

with

\[ \frac{\partial \lambda}{\partial a_{ij}}\langle\vec{u},\vec{v}\rangle = \left\langle\frac{\partial A}{\partial a_{ij}}\vec{u},\vec{v}\right\rangle + \left\langle\frac{\partial \vec{u}}{\partial a_{ij}},\left(A^T - \lambda I\right)\vec{v}\right\rangle \]

Annihilate the second inner product by forcing the condition

\[ \left(A^T - \lambda I\right)\vec{v} = \vec{0} \]

The adjoint problem is the left eigenvalue problem

\[ A^T\vec{v} = \lambda\vec{v} \]


Eigenvalue Problem

Now

\[ \frac{\partial \lambda}{\partial a_{ij}}\langle\vec{u},\vec{v}\rangle = \left\langle\frac{\partial A}{\partial a_{ij}}\vec{u},\vec{v}\right\rangle + \left\langle\frac{\partial \vec{u}}{\partial a_{ij}},\left(A^T - \lambda I\right)\vec{v}\right\rangle \]

reduces to

\[ \frac{\partial \lambda}{\partial a_{ij}}\langle\vec{u},\vec{v}\rangle = \left\langle\frac{\partial A}{\partial a_{ij}}\vec{u},\vec{v}\right\rangle = u_j v_i \]

For the kth right & left eigenvalue problems

\[ A\vec{u}_k = \lambda_k\vec{u}_k \quad\text{and}\quad A^T\vec{v}_k = \lambda_k\vec{v}_k \]

it can be shown that \(\langle\vec{u}_k,\vec{v}_k\rangle \neq 0\).


Eigenvalue Problem

Right eigenvalue problem (forward problem)

\[ A\vec{u}_k = \lambda_k\vec{u}_k \]

Derivative of the eigenvalue:

\[ \frac{\partial \lambda_k}{\partial a_{ij}} = \frac{u_j^{(k)}\,v_i^{(k)}}{\langle\vec{u}_k,\vec{v}_k\rangle} \]

Associated left eigenvalue problem (adjoint problem)

\[ A^T\vec{v}_k = \lambda_k\vec{v}_k \]

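A quick numerical sanity check of this formula (a sketch, not from the original slides): for a random matrix with distinct eigenvalues, take the left eigenvectors as the columns of (U⁻¹)ᵀ so they are paired with the right eigenvectors, and compare against a central finite difference of the (i, j) entry:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, i, j = 4, 1, 2, 3                 # illustrative sizes and indices
A = rng.standard_normal((n, n))

lam, U = np.linalg.eig(A)               # right eigenvectors: columns of U
V = np.linalg.inv(U).T                  # matching left eigenvectors (A^T V = V Lambda)
u_k, v_k = U[:, k], V[:, k]
analytic = u_k[j] * v_k[i] / (v_k @ u_k)

def nearest(vals, target):
    return vals[np.argmin(np.abs(vals - target))]

h = 1e-6
Ap, Am = A.copy(), A.copy()
Ap[i, j] += h
Am[i, j] -= h
fd = (nearest(np.linalg.eigvals(Ap), lam[k]) -
      nearest(np.linalg.eigvals(Am), lam[k])) / (2 * h)
print(analytic, fd)                     # the two values should agree closely
```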

Eigenvalue Problem

Next, we determine ∂\(\vec{u}\)/∂a_ij. Normalize the right eigenvectors:

\[ \langle\vec{u}_k,\vec{u}_k\rangle = 1. \]

Fix the indexes i, j; differentiating this condition gives

\[ \vec{u}_k^T\frac{\partial \vec{u}_k}{\partial a_{ij}} + \frac{\partial \vec{u}_k^T}{\partial a_{ij}}\vec{u}_k = 0. \]

Now use the identity \(\vec{a}^T\vec{b} = \vec{b}^T\vec{a}\):

\[ \frac{\partial \vec{u}_k^T}{\partial a_{ij}}\vec{u}_k = \vec{u}_k^T\frac{\partial \vec{u}_k}{\partial a_{ij}}, \]

which gives the result that \(\vec{u}_k\) and ∂\(\vec{u}_k\)/∂a_ij are orthogonal, i.e.,

\[ \left\langle\frac{\partial \vec{u}_k}{\partial a_{ij}},\vec{u}_k\right\rangle = 0, \quad\text{for } k = 1,\dots,n \]


Eigenvalue Problem

FSE:

\[ A\frac{\partial \vec{u}_k}{\partial a_{ij}} + \frac{\partial A}{\partial a_{ij}}\vec{u}_k = \lambda_k\frac{\partial \vec{u}_k}{\partial a_{ij}} + \frac{\partial \lambda_k}{\partial a_{ij}}\vec{u}_k \]

Premultiplying by \(\vec{u}_k^T\) and using the orthogonality condition gives

\[ \vec{u}_k^T A\frac{\partial \vec{u}_k}{\partial a_{ij}} + \vec{u}_k^T\frac{\partial A}{\partial a_{ij}}\vec{u}_k = \frac{\partial \lambda_k}{\partial a_{ij}} \]

Using the inner product notation,

\[ \left\langle A\frac{\partial \vec{u}_k}{\partial a_{ij}},\vec{u}_k\right\rangle = \frac{\partial \lambda_k}{\partial a_{ij}} - \left\langle\frac{\partial A}{\partial a_{ij}}\vec{u}_k,\vec{u}_k\right\rangle, \quad\text{for } k = 1,\dots,N \]


Eigenvalue Problem

To find an explicit expression for ∂\(\vec{u}\)/∂a_ij, we must introduce additional information.

The key to making further progress is to recall that we have assumed that the N × N matrix A has N distinct eigenvalues, in which case there exists a complete set of N eigenvectors.

Any vector in C^N can be expressed as a linear combination of the spanning eigenvectors.

Since ∂\(\vec{u}\)/∂a_ij is an N × 1 vector, we can write this derivative as a linear combination of the eigenvectors.


Eigenvalue Problem

Define the eigenvector matrices U and V, whose columns are the individual eigenvectors \(\vec{u}_k\) and \(\vec{v}_k\) respectively:

\[ U := \left(\vec{u}_1\ \vec{u}_2\ \cdots\ \vec{u}_N\right) \quad\&\quad V := \left(\vec{v}_1\ \vec{v}_2\ \cdots\ \vec{v}_N\right) \]

Let Λ be the diagonal matrix of eigenvalues λ_k:

\[ \Lambda := \begin{pmatrix} \lambda_1 & & & \\ & \lambda_2 & & \\ & & \ddots & \\ & & & \lambda_N \end{pmatrix} \]

Using this notation, the right and left eigenvalue problems can be written as

\[ AU = U\Lambda \quad\text{and}\quad A^T V = V\Lambda \]


Eigenvalue Problem

Earlier we forced the right and left eigenvectors to be normalized, and therefore the eigenvector matrices satisfy the identity

\[ V^T U = I \]

The derivative of the matrix of eigenvectors can be written as a linear combination of the eigenspace

\[ \frac{\partial U}{\partial a_{ij}} = UC \]

where the coefficient matrix is

\[ C := \begin{pmatrix} c_1^{(1)} & c_1^{(2)} & c_1^{(3)} & \cdots & c_1^{(N)} \\ c_2^{(1)} & c_2^{(2)} & c_2^{(3)} & \cdots & c_2^{(N)} \\ \vdots & \vdots & \vdots & & \vdots \\ c_N^{(1)} & c_N^{(2)} & c_N^{(3)} & \cdots & c_N^{(N)} \end{pmatrix} \]


Eigenvalue Problem

For a fixed eigenvector \(\vec{u}^{(k)}\), the derivative can be expanded as the sum

\[ \frac{\partial \vec{u}^{(k)}}{\partial a_{ij}} = c_1^{(k)}\vec{u}^{(1)} + \cdots + c_k^{(k)}\vec{u}^{(k)} + \cdots + c_N^{(k)}\vec{u}^{(N)} \]

Differentiating the right eigenvector matrix equation gives

\[ A\frac{\partial U}{\partial a_{ij}} + \frac{\partial A}{\partial a_{ij}}U = U\frac{\partial \Lambda}{\partial a_{ij}} + \frac{\partial U}{\partial a_{ij}}\Lambda \]

Rearranging, we get

\[ U\left[\Lambda, C\right] = U\frac{\partial \Lambda}{\partial a_{ij}} - \frac{\partial A}{\partial a_{ij}}U \]

where [·,·] denotes the commutator bracket \([\Lambda, C] := \Lambda C - C\Lambda\).


Eigenvalue Problem

Premultiplying by the left eigenvector matrix and using the normalization condition, this equation reduces to

\[ \left[\Lambda, C\right] = \frac{\partial \Lambda}{\partial a_{ij}} - V^T\frac{\partial A}{\partial a_{ij}}U \]

Expanding the commutator bracket we find that

\[ \left[\Lambda, C\right] = \begin{pmatrix} 0 & c_1^{(2)}(\lambda_1 - \lambda_2) & c_1^{(3)}(\lambda_1 - \lambda_3) & \cdots & c_1^{(N)}(\lambda_1 - \lambda_N) \\ c_2^{(1)}(\lambda_2 - \lambda_1) & 0 & c_2^{(3)}(\lambda_2 - \lambda_3) & \cdots & c_2^{(N)}(\lambda_2 - \lambda_N) \\ c_3^{(1)}(\lambda_3 - \lambda_1) & c_3^{(2)}(\lambda_3 - \lambda_2) & 0 & \cdots & c_3^{(N)}(\lambda_3 - \lambda_N) \\ \vdots & & & \ddots & \vdots \\ c_N^{(1)}(\lambda_N - \lambda_1) & c_N^{(2)}(\lambda_N - \lambda_2) & c_N^{(3)}(\lambda_N - \lambda_3) & \cdots & 0 \end{pmatrix} \]

Since the right side is known, and because we assumed that the eigenvalues are distinct, we can solve for the off-diagonal coefficients

\[ c_l^{(m)} = -\frac{1}{\lambda_l - \lambda_m}\left[V^T\frac{\partial A}{\partial a_{ij}}U\right]_{lm} \quad\text{for } l \neq m \]


Eigenvalue Problem

Use the fact that the eigenvectors form a basis for C^N.

Need to solve for the scalar diagonal coefficients c_k^{(k)}.

Using the fact that \(\vec{u}_k\) and ∂\(\vec{u}_k\)/∂a_ij are orthogonal, we obtain the equation

\[ c_1^{(k)}\langle\vec{u}^{(1)},\vec{u}^{(k)}\rangle + \cdots + \underbrace{c_k^{(k)}}_{\text{solve for}}\langle\vec{u}^{(k)},\vec{u}^{(k)}\rangle + \cdots + c_N^{(k)}\langle\vec{u}^{(N)},\vec{u}^{(k)}\rangle = 0. \]

The diagonal coefficients in terms of the known off-diagonal coefficients are

\[ c_k^{(k)} = -\sum_{\substack{i=1 \\ i\neq k}}^{N} c_i^{(k)}\langle\vec{u}^{(i)},\vec{u}^{(k)}\rangle \]


Summary of the Eigenvalue Problem

Forward/Adjoint problems: \(A\vec{u} = \lambda\vec{u}\) and \(A^T\vec{v} = \lambda\vec{v}\)

Derivative of the eigenvalues:

\[ \frac{\partial \lambda_k}{\partial a_{ij}} = \frac{u_j^{(k)}\,v_i^{(k)}}{\langle\vec{u}_k,\vec{v}_k\rangle} \]


Summary of the Eigenvalue Problem

Derivative of the eigenvectors:

\[ \frac{\partial U}{\partial a_{ij}} = UC \]

where the off-diagonal coefficients are

\[ c_l^{(m)} = -\frac{1}{\lambda_l - \lambda_m}\left[V^T\frac{\partial A}{\partial a_{ij}}U\right]_{lm} \quad\text{for } l \neq m \]

and the diagonal coefficients are

\[ c_k^{(k)} = -\sum_{\substack{i=1 \\ i\neq k}}^{N} c_i^{(k)}\langle\vec{u}^{(i)},\vec{u}^{(k)}\rangle \]

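These formulas translate directly into a small numpy sketch (illustrative only; it assumes real, distinct eigenvalues so that numpy's unit-norm right eigenvectors match the normalization ⟨u_k, u_k⟩ = 1 used above, and takes V = (U⁻¹)ᵀ so that VᵀU = I):

```python
import numpy as np

def eigvec_derivative(A, i, j):
    """Sensitivity dU/da_ij of all right eigenvectors of A to the entry a_ij."""
    n = A.shape[0]
    lam, U = np.linalg.eig(A)                 # right eigenvectors (columns)
    V = np.linalg.inv(U).T                    # left eigenvectors with V^T U = I
    dA = np.zeros_like(A)
    dA[i, j] = 1.0                            # dA/da_ij

    M = V.T @ dA @ U
    C = np.zeros((n, n), dtype=complex)
    for l in range(n):                        # off-diagonal coefficients
        for m_ in range(n):
            if l != m_:
                C[l, m_] = -M[l, m_] / (lam[l] - lam[m_])
    for k in range(n):                        # diagonal coefficients from orthogonality
        C[k, k] = -sum(C[s, k] * (U[:, s] @ U[:, k])
                       for s in range(n) if s != k)
    return U @ C                              # dU/da_ij

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
print(eigvec_derivative(A, 0, 2))
```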

Dimensionality Reduction

To simplify a mathematical model, where numerous categories of variables exist, one would like to be able to identify those variables that can be safely eliminated without affecting the validity of the model.

In order to not inadvertently eliminate significant variables, one must identify groups of variables that are highly correlated & have strongly interacting mechanisms.

Data contains errors or noise.

Need to estimate the uncertainty in the correlation between variables.

Uncertainty in the data creates uncertainty in the correlation estimates and ultimately in the reduced model.


Highly Correlated Data Sets

Consider an imaginary disease for which a specific blood test can, with absolute certainty, identify whether the patient has or does not have this disease.

Suppose that there exists a medication whose sole purpose is to treat this particular disease.

The number of prescriptions for this medication and the positive blood test results are highly correlated.

Assuming that the examining physician always prescribes this medication, the correlation would in fact be 1.0.

The information contained in these two data sets is redundant.

Since the two data sets are so highly correlated, a projection from a 2-dimensional parameter space to a 1-dimensional space would be appropriate.


Bio–Syndromic Surveillance

Consider the scenario where public health officials are monitoring a seasonal outbreak of a disease. Syndromic surveillance/biosurveillance data track clinical symptoms such as

fever
number of hospital admissions
over-the-counter medication consumption
respiratory complaints
school or work absences, etc.

While this data is readily available, it does not directly provide accurate numerical quantification of the size of the outbreak. Noise in the data causes inaccuracy of any specific numerical assessments or predictions. Symptoms such as fever and respiratory complaints have different levels of correlation for different diseases.


Principal Component Analysis

Principal component analysis (PCA) is a powerful method of modern data analysis that provides a systematic way to reduce the dimension of a complex data set to a lower dimension.

Can reveal hidden simplified structures that would otherwise go unnoticed.


Principal Component Analysis

Consider an M × N matrix of data measurements A with M data types and N observations of each data type.

Each M × 1 column of A represents the measurement of data at some time t_n, for which there are N time samples.

\[ A = \begin{pmatrix} \cdots & a_{1j} = \text{Temperature} & \cdots \\ & a_{2j} = \#\,\text{Ca}^{2+}\text{ in gap junction} & \\ \vdots & a_{3j} = \text{Reflectance} & \vdots \\ & \vdots & \\ \cdots & a_{Mj} = \text{Voltage} & \cdots \end{pmatrix}_{M\times N} \]


Principal Component Analysis

Since any M × 1 vector lies in an M-dimensional vector space, there exists an M-dimensional orthonormal basis that spans the vector space.

The goal of PCA is to transform the noisy, and possibly redundant, data set to a lower dimensional orthonormal basis.

The new basis will filter out the noisy data and reveal hidden structures among the data types.

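A minimal PCA sketch via the SVD of the centered data matrix (rows are data types, columns are observations); the synthetic data and names below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
M, N = 5, 200
A = rng.standard_normal((M, N))
A[1] = 0.9 * A[0] + 0.1 * rng.standard_normal(N)   # two highly correlated data types

A_c = A - A.mean(axis=1, keepdims=True)            # center each data type (row)
U, S, Vt = np.linalg.svd(A_c, full_matrices=False)

k = 2                                              # keep the two leading components
scores = U[:, :k].T @ A_c                          # data expressed in the new basis
explained = S[:k]**2 / np.sum(S**2)                # fraction of variance captured
print(explained)
```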

Singular Value Decomposition (SVD)

Let A be a real M × N matrix and let r denote the rank of A. The SVD defines a particular factorization \(A = U\Sigma V^T\) where

U is an M × M orthogonal matrix, i.e., \(U^T U = I_{M\times M}\)

V is an N × N orthogonal matrix, i.e., \(V^T V = I_{N\times N}\)

Σ is the M × N diagonal matrix of singular values σ_1 ≥ σ_2 ≥ ⋯ ≥ σ_r > 0, with σ_{r+1} = ⋯ = σ_p = 0 and p := min(M, N):

\[ \Sigma = \begin{pmatrix} \sigma_1 & & & & \\ & \ddots & & & \\ & & \sigma_r & & \\ & & & 0 & \\ & & & & \ddots \\ & & & & & 0 \end{pmatrix} \]

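A quick numerical illustration of this factorization with numpy (sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
M, N = 4, 6
A = rng.standard_normal((M, N))

U, s, Vt = np.linalg.svd(A)                 # U: MxM, s: singular values, Vt: NxN
Sigma = np.zeros((M, N))
Sigma[:M, :M] = np.diag(s)                  # embed the singular values in an MxN matrix

print(np.allclose(U @ Sigma @ Vt, A))       # A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(M)))      # U orthogonal
print(np.allclose(Vt @ Vt.T, np.eye(N)))    # V orthogonal
```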

Singular Value Decomposition (SVD)

Find the M columns \(\vec{u}^{(m)}\) (called the left singular vectors) of U, and the N columns \(\vec{v}^{(n)}\) (called the right singular vectors) of V, where

\[ U := \left(\vec{u}^{(1)}\ \vec{u}^{(2)}\ \cdots\ \vec{u}^{(M)}\right), \quad V := \left(\vec{v}^{(1)}\ \vec{v}^{(2)}\ \cdots\ \vec{v}^{(N)}\right) \]

by solving the singular value problems

\[ A\vec{v} = \sigma\vec{u} \quad\text{and}\quad A^T\vec{u} = \sigma\vec{v} \]

We will first find ∂σ/∂a_ij.


Sensitivity of SVD

Differentiate the singular value problems A v = σ u and A^T u = σ v with respect to aij to get the FSEs

$$A\,\frac{\partial \vec{v}}{\partial a_{ij}} + \frac{\partial A}{\partial a_{ij}}\,\vec{v}
 = \sigma\,\frac{\partial \vec{u}}{\partial a_{ij}} + \frac{\partial \sigma}{\partial a_{ij}}\,\vec{u}$$

$$A^{T}\,\frac{\partial \vec{u}}{\partial a_{ij}} + \frac{\partial A^{T}}{\partial a_{ij}}\,\vec{u}
 = \sigma\,\frac{\partial \vec{v}}{\partial a_{ij}} + \frac{\partial \sigma}{\partial a_{ij}}\,\vec{v}$$

Problem: 3 unknowns but only 2 equations.


Sensitivity of SVD

Since U and V are orthogonal, i.e., U^T U = I and V^T V = I, the singular vectors are normalized: u^T u = 1 and v^T v = 1.

Differentiating these normalization conditions gives the orthogonality conditions

$$\vec{u}^{\,T}\frac{\partial \vec{u}}{\partial a_{ij}} = 0
\quad\text{and}\quad
\vec{v}^{\,T}\frac{\partial \vec{v}}{\partial a_{ij}} = 0.$$

Premultiply the first FSE by u^T; using the orthogonality and normalization conditions, it reduces to

$$\vec{u}^{\,T} A \frac{\partial \vec{v}}{\partial a_{ij}}
 + \vec{u}^{\,T}\frac{\partial A}{\partial a_{ij}}\vec{v}
 = \sigma\underbrace{\vec{u}^{\,T}\frac{\partial \vec{u}}{\partial a_{ij}}}_{=\,0}
 + \frac{\partial \sigma}{\partial a_{ij}}\underbrace{\vec{u}^{\,T}\vec{u}}_{=\,1}.$$


Sensitivity of SVD

Rewrite the singular value problem A^T u = σ v as u^T A = σ v^T.

Use this result, together with the orthogonality condition, to eliminate the first term:

$$\frac{\partial \sigma}{\partial a_{ij}}
 = \vec{u}^{\,T} A \frac{\partial \vec{v}}{\partial a_{ij}}
 + \vec{u}^{\,T}\frac{\partial A}{\partial a_{ij}}\vec{v}
 = \sigma\underbrace{\vec{v}^{\,T}\frac{\partial \vec{v}}{\partial a_{ij}}}_{=\,0}
 + \vec{u}^{\,T}\frac{\partial A}{\partial a_{ij}}\vec{v}
 = \vec{u}^{\,T}\frac{\partial A}{\partial a_{ij}}\vec{v},$$

and since ∂A/∂aij has a single nonzero entry (a 1 in position (i, j)),

$$\frac{\partial \sigma}{\partial a_{ij}} = u_i\, v_j.$$
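As a quick numerical sanity check of this formula (a sketch of our own construction, not part of the original derivation), one can compare u_i v_j for the leading singular triple of a random matrix against a finite-difference derivative of σ1; numpy and the particular sizes, indices, and step size below are illustrative choices.

import numpy as np

# Illustrative check of d(sigma_1)/d(a_ij) = u_i * v_j for the leading
# singular triple of a random matrix.
rng = np.random.default_rng(0)
M, N, i, j, h = 5, 4, 2, 1, 1e-6
A = rng.standard_normal((M, N))

U, S, Vt = np.linalg.svd(A)
predicted = U[i, 0] * Vt[0, j]                    # u_i * v_j from the formula

E = np.zeros_like(A); E[i, j] = h                 # perturb the single entry a_ij
finite_diff = (np.linalg.svd(A + E, compute_uv=False)[0] - S[0]) / h
print(predicted, finite_diff)                     # the two values agree to O(h)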


Sensitivity of SVD

Since the derivative of each left singular vector lies in R^M, it can be written as a linear combination of the left singular vectors. Define the unknown coefficient matrix

$$C := \begin{pmatrix}
c_1^{(1)} & c_1^{(2)} & c_1^{(3)} & \cdots & c_1^{(M)} \\
c_2^{(1)} & c_2^{(2)} & c_2^{(3)} & \cdots & c_2^{(M)} \\
\vdots & \vdots & \vdots & & \vdots \\
c_M^{(1)} & c_M^{(2)} & c_M^{(3)} & \cdots & c_M^{(M)}
\end{pmatrix},$$

so that the derivative of the singular matrix can be written as

$$\frac{\partial U}{\partial a_{ij}} = U C.$$

The singular value problems can be written in matrix form as

$$A V = U \Sigma \quad\text{and}\quad A^{T} U = V \Sigma^{T}.$$


Sensitivity of SVD

Differentiating the singular matrix equation A^T U = V Σ^T gives

$$A^{T}\frac{\partial U}{\partial a_{ij}} + \frac{\partial A^{T}}{\partial a_{ij}} U
 = V \frac{\partial \Sigma^{T}}{\partial a_{ij}} + \frac{\partial V}{\partial a_{ij}} \Sigma^{T}.$$

Using the fact that ∂U/∂aij = U C (a linear combination of the left singular vectors), we get

$$A^{T} U C - \frac{\partial V}{\partial a_{ij}} \Sigma^{T}
 = V \frac{\partial \Sigma^{T}}{\partial a_{ij}} - \frac{\partial A^{T}}{\partial a_{ij}} U.$$


Sensitivity of SVD

Differentiate the singular matrix equation A V = U Σ to obtain

$$A \frac{\partial V}{\partial a_{ij}}
 = U \frac{\partial \Sigma}{\partial a_{ij}} + U C \Sigma - \frac{\partial A}{\partial a_{ij}} V.$$

Premultiply

$$A^{T} U C - \frac{\partial V}{\partial a_{ij}} \Sigma^{T}
 = V \frac{\partial \Sigma^{T}}{\partial a_{ij}} - \frac{\partial A^{T}}{\partial a_{ij}} U$$

by the matrix A:

$$A A^{T} U C - A\frac{\partial V}{\partial a_{ij}} \Sigma^{T}
 = A V \frac{\partial \Sigma^{T}}{\partial a_{ij}} - A\frac{\partial A^{T}}{\partial a_{ij}} U$$

$$A A^{T} U C - \left( U \frac{\partial \Sigma}{\partial a_{ij}} + U C \Sigma - \frac{\partial A}{\partial a_{ij}} V \right)\Sigma^{T}
 = A V \frac{\partial \Sigma^{T}}{\partial a_{ij}} - A\frac{\partial A^{T}}{\partial a_{ij}} U.$$


Sensitivity of SVD

Rearranging so as to isolate the terms containing U C on the left side of the equation, we get

$$A A^{T} U C - U C \Sigma\Sigma^{T}
 = A V \frac{\partial \Sigma^{T}}{\partial a_{ij}}
 - A \frac{\partial A^{T}}{\partial a_{ij}} U
 + U \frac{\partial \Sigma}{\partial a_{ij}} \Sigma^{T}
 - \frac{\partial A}{\partial a_{ij}} V \Sigma^{T}.$$

In order to simplify this result, consider the left side of this equation:

$$A A^{T} U C - U C \Sigma\Sigma^{T}
 = A V \Sigma^{T} C - U C \Sigma\Sigma^{T}
 = U \Sigma\Sigma^{T} C - U C \Sigma\Sigma^{T}
 = U \left[\Sigma\Sigma^{T}, C\right],$$

where [·,·] denotes the commutator bracket

$$\left[\Sigma\Sigma^{T}, C\right] := \Sigma\Sigma^{T} C - C\,\Sigma\Sigma^{T}.$$


Sensitivity of SVD

Rewrite the expression

$$A V \frac{\partial \Sigma^{T}}{\partial a_{ij}} + U \frac{\partial \Sigma}{\partial a_{ij}} \Sigma^{T}
 = U \Sigma \frac{\partial \Sigma^{T}}{\partial a_{ij}} + U \frac{\partial \Sigma}{\partial a_{ij}} \Sigma^{T}
 = U \frac{\partial}{\partial a_{ij}}\!\left[\Sigma\Sigma^{T}\right].$$

Next rewrite the expression

$$A \frac{\partial A^{T}}{\partial a_{ij}} U + \frac{\partial A}{\partial a_{ij}} V \Sigma^{T}
 = A \frac{\partial A^{T}}{\partial a_{ij}} U + \frac{\partial A}{\partial a_{ij}} A^{T} U
 = \left(\frac{\partial}{\partial a_{ij}}\!\left[A A^{T}\right]\right) U.$$


Sensitivity of SVD

These simplifications give the system of equations for the coefficients ck^(l):

$$U\left[\Sigma\Sigma^{T}, C\right]
 = U \frac{\partial}{\partial a_{ij}}\!\left[\Sigma\Sigma^{T}\right]
 - \left(\frac{\partial}{\partial a_{ij}}\!\left[A A^{T}\right]\right) U.$$

Using the orthogonality of U, the commutator bracket simplifies to the final form

$$\left[\Sigma\Sigma^{T}, C\right]
 = \frac{\partial}{\partial a_{ij}}\!\left[\Sigma\Sigma^{T}\right]
 - U^{T}\left(\frac{\partial}{\partial a_{ij}}\!\left[A A^{T}\right]\right) U.$$

Expanding the commutator bracket, we find that

$$\left[\Sigma\Sigma^{T}, C\right]_{kl} =
\begin{cases}
0 & k = l,\ \text{or } k, l > r\\[2pt]
c_k^{(l)}\left(\sigma_k^2 - \sigma_l^2\right) & k, l \le r\\[2pt]
-\,c_k^{(l)}\,\sigma_l^2 & l \le r,\ k \ge r+1\\[2pt]
c_k^{(l)}\,\sigma_k^2 & k \le r,\ l \ge r+1
\end{cases}$$

We assumed the singular values are distinct, so we can solve for the off-diagonal coefficients.
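As an end-to-end check of this construction (a sketch of our own, with arbitrary sizes, indices, and step size), the coefficients ck^(l) can be assembled from this system, leaving the diagonal entries at zero as justified below, and ∂U/∂aij = U C compared against a finite-difference derivative of U.

import numpy as np

# Build C from the commutator system and verify dU/da_ij = U C numerically.
rng = np.random.default_rng(7)
M, N, i, j, h = 5, 4, 1, 3, 1e-7
A = rng.standard_normal((M, N))
U, S, Vt = np.linalg.svd(A)

lam = np.zeros(M); lam[:N] = S**2                 # eigenvalues of A A^T: sigma_k^2, padded with 0
E = np.zeros_like(A); E[i, j] = 1.0
dAAT = E @ A.T + A @ E.T                          # d(A A^T)/da_ij
dSS = np.zeros(M)
dSS[:N] = 2.0 * S * U[i, :N] * Vt[:, j]           # d(sigma_k^2)/da_ij = 2 sigma_k u_ki v_kj

RHS = np.diag(dSS) - U.T @ dAAT @ U               # right side of the commutator system
D = lam[:, None] - lam[None, :]                   # sigma_k^2 - sigma_l^2
C = np.zeros((M, M))
off = np.abs(D) > 1e-12
C[off] = RHS[off] / D[off]                        # off-diagonal coefficients; diagonal stays 0
dU_pred = U @ C

U1, _, _ = np.linalg.svd(A + h * E)
U1 *= np.sign(np.sum(U * U1, axis=0))             # remove the SVD sign ambiguity
print(np.max(np.abs((U1 - U) / h - dU_pred)))     # small: the two derivatives agree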


Sensitivity of SVD

The next task is to find the values of the diagonal coefficients. Once again, we make use of the fact that the left singular vectors {u^(k)} form a basis for R^M; that is, for a fixed singular vector u^(k), the derivative is expanded as the sum

$$\frac{\partial \vec{u}^{(k)}}{\partial a_{ij}}
 = c_1^{(k)}\vec{u}^{(1)} + \cdots + c_k^{(k)}\vec{u}^{(k)} + \cdots + c_M^{(k)}\vec{u}^{(M)}.$$

Since the derivative of a singular vector is orthogonal to the singular vector itself, we get

$$c_1^{(k)}\langle \vec{u}^{(1)}, \vec{u}^{(k)}\rangle + \cdots
 + c_k^{(k)}\langle \vec{u}^{(k)}, \vec{u}^{(k)}\rangle + \cdots
 + c_M^{(k)}\langle \vec{u}^{(M)}, \vec{u}^{(k)}\rangle = 0.$$

Since the individual singular vectors are orthonormal, the diagonal coefficients ck^(k) are all identically zero.

Using similar methods we can find ∂V/∂aij.
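The vanishing of the diagonal coefficients can also be observed directly; the short check below (our own illustration, with an arbitrary matrix and entry index) differences the left singular vectors with respect to a single entry aij and confirms that u^(k) · ∂u^(k)/∂aij ≈ 0.

import numpy as np

# Each left singular vector keeps unit length as a_ij varies, so the diagonal
# coefficient c_k^(k) = u^(k)^T du^(k)/da_ij should vanish (up to FD error).
rng = np.random.default_rng(3)
M, N, i, j, h = 5, 4, 0, 2, 1e-7
A = rng.standard_normal((M, N))

U0, _, _ = np.linalg.svd(A)
E = np.zeros_like(A); E[i, j] = h
U1, _, _ = np.linalg.svd(A + E)
U1 *= np.sign(np.sum(U0 * U1, axis=0))        # remove the SVD sign ambiguity
dU = (U1 - U0) / h                            # finite-difference dU/da_ij
print(np.abs(np.sum(U0 * dU, axis=0)))        # diagonal coefficients, all ~ 0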


Formality of the Adjoint Method

Problems which are amenable to the adjoint methodology are those that can be expressed in the form

$$F(u) = f,$$

where F is a linear or nonlinear operator F : X → Y and f is the forward forcing function. The domain X and range Y are assumed to have sufficiently nice topological properties (for example, Hilbert or Sobolev spaces).

Associated with the forward problem is the task of determining the sensitivity of some desired response function(al) J(u).

The adjoint problem and adjoint variable v ∈ X arise through the calculation of the Gateaux derivative:

$$F'(u)v := \lim_{\varepsilon \to 0} \frac{F(u + \varepsilon v) - F(u)}{\varepsilon}.$$
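As a concrete and deliberately simple illustration of the Gateaux derivative (our own choice of operator, not from the slides), take the pointwise map F(u) = u³ on R³, for which F′(u)v = 3u²v; the difference quotient approaches this limit as ε → 0.

import numpy as np

# Gateaux derivative of the pointwise operator F(u) = u**3: the quotient
# (F(u + eps*v) - F(u))/eps approaches F'(u)v = 3*u**2*v as eps -> 0.
u = np.array([1.0, -0.5, 2.0])
v = np.array([0.3,  1.0, -1.0])
F = lambda w: w**3

eps = 1e-6
print((F(u + eps * v) - F(u)) / eps)   # difference quotient
print(3 * u**2 * v)                    # analytic Gateaux derivative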


Formality of the Adjoint Method

The notation F′(u)v is intended to suggest that the operator F takes the forward variable u and maps it to an operator F′, which now depends on both u and the adjoint variable v.

Using the intermediate-value theorem for nonlinear operators, we can formulate an extended representation of F and rewrite the forward operator in the form

$$\Phi(u)\,u = F(u),$$

where the residual operator Φ is defined in integral form as

$$\Phi(u) := \int_{\tau=0}^{1} F'(\tau u)\, d\tau.$$


Formality of the Adjoint Method

Given that an appropriate inner product has been defined, consider the adjoint operation

$$\langle \Phi(u)v, w\rangle = \mathrm{SC}_1 + \langle v, \Phi^{\dagger}(u)w\rangle,$$

where SC1 denotes the 1st solvability condition and Φ† denotes the adjoint operator associated with the forward operator Φ. When SC1 = 0, the result is referred to as the Lagrange identity.

The associated generalized adjoint problem is defined as

$$\Phi^{\dagger}(u)\,v = g,$$

where the adjoint forcing function g has not yet been specified.


Formality of the Adjoint Method

Taking the dot product of the forward problem with the adjoint solution gives

$$\langle \Phi(u)u, v\rangle = \langle f, v\rangle.$$

Taking the dot product of the adjoint problem with the forward solution gives

$$\langle \Phi^{\dagger}(u)v, u\rangle = \langle g, u\rangle
\;\Longrightarrow\;
\langle v, \Phi(u)u\rangle = \langle g, u\rangle
\;\Longrightarrow\;
\langle v, f\rangle = \langle g, u\rangle.$$


Formality of the Adjoint Method

Relate the forward and adjoint problems through ⟨g, u⟩ = ⟨f, v⟩. The adjoint forcing function g is cleverly chosen so that ⟨g, u⟩ = J(u).

    Forward Problem:  Φ(u)u = f              Adjoint Problem:  Φ†(u)v = g
            ↓                                        ↓
    Adjoint Product:  ⟨Φ(u)u, v⟩ = ⟨f, v⟩      Forward Product:  ⟨Φ†(u)v, u⟩ = ⟨g, u⟩
            ↓                                        ↓
    Adjoint Response:  J(u) = ⟨f, v⟩           Forward Response:  J(u) = ⟨g, u⟩
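For the linear special case Φ(u) = A (a minimal sketch of our own, not from the slides), this duality is easy to exercise: with A u = f and a response J(u) = ⟨g, u⟩, the adjoint problem is A^T v = g and the same response is recovered as ⟨f, v⟩.

import numpy as np

# Linear-case sketch: forward response <g, u> equals adjoint response <f, v>.
rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)   # a well-conditioned forward operator
f = rng.standard_normal(4)                        # forward forcing function
g = rng.standard_normal(4)                        # adjoint forcing, chosen so <g, u> = J(u)

u = np.linalg.solve(A, f)       # forward problem  A u = f
v = np.linalg.solve(A.T, g)     # adjoint problem  A^T v = g
print(g @ u, f @ v)             # the two numbers coincide (up to round-off)

The practical payoff, which the diagram above encodes, is that a single adjoint solve for v yields the response J for any forward forcing f without re-solving the forward problem.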


Caveat Emptor: When the Adjoint Method Fails

"Let the buyer beware!"

In order for an adjoint problem to be defined, an associated inner product structure must exist. No inner product ⟹ no adjoint.

To determine the sensitivity of the associated functional J = J(u) using the adjoint methodology, the functional must be cleverly written in terms of the inner product.

Once an adjoint problem has been defined, if more than one sensitivity is required (e.g., recall the case of the sensitivity of the SVD), additional information must be introduced to make further progress.

SA as discussed here is local in nature. The estimates of derivatives are valid only in some "small" neighborhood of the specified nominal values of the parameters. For a more global approach, uncertainty quantification methodology should be used.


Sensitivity of the Doubling/Tripling Time

Suppose we have an IVP and we are interested in the time it takes for the solution u = u(t) to double or triple its initial value, i.e., u(tD) = 2u0.

For example, we might wish to know how the doubling time for the number of people infected in an epidemic is affected by changes to a specified parameter, via ∂tD/∂p.

The typical difficulty is that, in general, we do not have the explicit forward solution, in which case explicit expressions for the desired derivatives are not available.

However, numerical values for these derivatives can be calculated through the numerical solution of the forward sensitivity equation(s).


Sensitivity of Doubling/Tripling Time

Lemma (Sensitivity of the time to attain a multiple of the initial condition)
Let u = u(t; p, u0) be the solution to the first-order IVP

$$\frac{du}{dt} = f(u, t; p) \quad\text{with}\quad u(0) = u_0,$$

where f is differentiable in u, t, and p. Let tk denote the time t for which u attains the value u(tk) = k u0, where k > 0. The derivatives are dtk/du0 = 0 and

$$\frac{dt_k}{dp} = -\,\frac{\left.\dfrac{\partial u}{\partial p}\right|_{t=t_k}}{f(k u_0,\, t_k;\, p)}.$$
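A small numerical sketch (our own construction, not from the slides) shows how the lemma is used in practice: augment the IVP with its forward sensitivity equation, locate tD with an event, and evaluate the quotient. The model du/dt = p u is used only because the exact answers tD = ln 2 / p and dtD/dp = −ln 2 / p² are available for comparison; scipy's solve_ivp is an implementation choice.

import numpy as np
from scipy.integrate import solve_ivp

# Doubling-time sensitivity for du/dt = p*u via the forward sensitivity equation.
p, u0 = 0.7, 1.0

def rhs(t, y):
    u, s = y                       # s = du/dp, the forward sensitivity
    return [p * u,                 # forward equation  f(u, t; p) = p*u
            p * s + u]             # FSE: ds/dt = (df/du)*s + df/dp

def doubled(t, y):                 # event: u(t) - 2*u0 = 0
    return y[0] - 2.0 * u0
doubled.terminal, doubled.direction = True, 1

sol = solve_ivp(rhs, [0.0, 10.0], [u0, 0.0], events=doubled, rtol=1e-10, atol=1e-12)
tD = sol.t_events[0][0]
u_tD, s_tD = sol.y_events[0][0]
dtD_dp = -s_tD / (p * u_tD)        # lemma: dt_D/dp = -(du/dp)|_{t_D} / f(2*u0, t_D; p)

print(tD, np.log(2) / p)           # doubling time vs. analytic ln(2)/p
print(dtD_dp, -np.log(2) / p**2)   # sensitivity vs. analytic -ln(2)/p**2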

Sensitivity of a Critical Point

Determine which parameter(s) of an IVP modeling the spread of an epidemic has the most effect on the peak of the infection.

In other words, we want to determine the sensitivity of a critical point to parameters or initial conditions.

Lemma (Sensitivity of Critical Points)
The derivatives dtcp/dp and dtcp/du0 are given by

$$\frac{dt_{cp}}{dp}
 = -\,\frac{\left.\left(\dfrac{\partial f}{\partial p} + \dfrac{\partial f}{\partial u}\dfrac{\partial u}{\partial p}\right)\right|_{t=t_{cp}}}
          {\left.\dfrac{\partial f}{\partial u}\right|_{t=t_{cp}}},
\qquad
\frac{dt_{cp}}{du_0}
 = -\,\frac{\left.\dfrac{\partial f}{\partial u}\dfrac{\partial u}{\partial u_0}\right|_{t=t_{cp}}}
          {\left.\dfrac{\partial f}{\partial u}\right|_{t=t_{cp}}}.$$


Sensitivity of Periodic Solutions to Parameters

A commonly occurring model in the biological and electrical engineering sciences is the nonlinear system of ODEs

$$\frac{dx}{dt} = y
\quad\text{and}\quad
\frac{dy}{dt} = -\omega^2 x + \lambda(1 - x^2)\, y,$$

which is often referred to as van der Pol's equations. Examples include the Belousov–Zhabotinski reaction, models of an oscillatory cardiac pacemaker, coupled oscillators in the small intestine, etc.

Alternatively, this system can be more conveniently written as the single second-order ODE

$$\frac{d^2 x}{dt^2} = \lambda(1 - x^2)\frac{dx}{dt} - \omega^2 x.$$

Parametric plot of (x(t), y(t)), where ω = √2, λ = 1, with ICs x(0) = 0.001, y(0) = 0.0, and t ∈ [0, 100].


Sensitivity of Periodic Solutions to Parameters

The trajectory starts near the origin, and it is evident that after a sufficient amount of time has elapsed, the solution converges to a periodic orbit, or limit cycle.

[Figure: parametric plot of (x(t), y(t)) for the van der Pol system, spiraling outward from near the origin onto the limit cycle.]

Sensitivity of Periodic Solutions to Parameters

Consider the IVP where the forward solution u approaches a limit cycle of period T as t → ∞:

$$u(t + T; u_0, u_0', p) = u(t; u_0, u_0', p), \qquad \forall t \in [0, \infty).$$

As is almost always the case, a closed form of the forward solution is not available, in which case the derivative ∂T/∂p cannot be obtained explicitly.


Sensitivity of Periodic Solutions to Parameters

The derivative ∂T/∂p is given by the following

Lemma (Sensitivity of a periodic function)
Let u = u(t; u0, u0', p) be a family of periodic functions with period T. The derivative of the period T with respect to the parameter p is given by

$$\frac{dT}{dp}
 = \frac{\dfrac{\partial u(t; u_0, u_0', p)}{\partial p}
        - \left.\dfrac{\partial u(s; u_0, u_0', p)}{\partial p}\right|_{s = t + T}}
        {\dfrac{\partial u(t; u_0, u_0', p)}{\partial t}}.$$
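Since no closed form of the van der Pol solution exists, the period and its parameter sensitivity must be obtained numerically. The sketch below (our own illustration, with arbitrary integration settings) estimates T from the spacing of upward zero crossings of x(t) on the limit cycle and then approximates dT/dλ with a centered finite difference, which can serve as a consistency check on the lemma above.

import numpy as np
from scipy.integrate import solve_ivp

# Estimate the van der Pol limit-cycle period T and its sensitivity dT/dlambda.
omega = np.sqrt(2.0)

def period(lam):
    def f(t, z):                   # van der Pol: x' = y, y' = -w^2 x + lam (1 - x^2) y
        x, y = z
        return [y, -omega**2 * x + lam * (1.0 - x**2) * y]

    def upward(t, z):              # upward zero crossings of x(t)
        return z[0]
    upward.direction = 1

    sol = solve_ivp(f, [0.0, 200.0], [0.001, 0.0], events=upward,
                    rtol=1e-10, atol=1e-12)
    t = sol.t_events[0]
    t = t[t > 100.0]               # discard the transient, keep the limit cycle
    return np.mean(np.diff(t))     # average crossing spacing = period T

h = 1e-4
print(period(1.0))                                   # T at lambda = 1
print((period(1.0 + h) - period(1.0 - h)) / (2*h))   # centered-difference dT/dlambda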


The End

Thank You! Any questions?
