# Stochastic processes: a review

Post on 20-Feb-2021


Course: Affective Computing Models (Modelli di Computazione Affettiva)

Prof. Giuseppe Boccignone

Dipartimento di Informatica

Università di Milano

http://boccignone.di.unimi.it/CompAff2016.html

Stochastic processes: general review

Where we are // Models

• Theories: psychological and neurobiological bases of emotions

• Models: probabilistic models for affective computing:

• Learning

• Inference

• Applications: examples of affective computing systems

• Model

[Figure: a hidden mental state and visible nonverbal behavior, linked by generation (state → behavior) and inference (behavior → state)]

Where we are // Models

Computational theory

Representation and algorithm

Hardware implementation

Models in the cognitive sciences // levels of explanation


Probabilistic models

Inference and learning algorithms

Hardware implementation

Models in the cognitive sciences // levels of explanation

Models consider processes that evolve over time...

[Figure: an emotion-eliciting event produces an emotion and nonverbal signals X, Y, measurable over time t]

Stochastic processes // Definition

We assume here that coins and dice are fair and have no memory, i.e., each outcome is equally likely on each toss, regardless of the results of previous tosses.

It is helpful to give a geometric representation of events using a Venn diagram. This is a diagram in which the sample space is presented using a closed-plane figure and sample points using the corresponding dots. The sample spaces (1.1) and (1.3) are shown in Fig. 1.1a, b, respectively.

The sample sets (1.1) and (1.3) are discrete and finite. The sample set can also be discrete and infinite. If the elements of the sample set are continuous (i.e., not countable), then the sample set S is continuous. For example, in an experiment which measures voltage over time T, the sample set (Fig. 1.2) is:

S = {s | V1 < s < V2}. (1.4)

In most situations, we are not interested in the occurrence of specific outcomes, but rather in certain characteristics of the outcomes. For example, in the voltage measurement experiment we might be interested in whether the voltage is positive or less than some desired value V. To handle such situations it is useful to introduce the concept of an event.

Fig. 1.1 Sample spaces for coin tossing and die rolling. (a) Coin tossing. (b) Die rolling

[Figure: the continuous sample space S as the interval (V1, V2)]

Fig. 1.2 Example of continuous space

2 1 Introduction to Sample Space and Probability

Sample space

Chapter 1 Introduction to Sample Space and Probability

1.1 Sample Space and Events

Probability theory is the mathematical analysis of random experiments [KLI86, p. 11]. An experiment is a procedure we perform that produces some result or outcome [MIL04, p. 8].

An experiment is considered random if the result of the experiment cannot be determined exactly. Although the particular outcome of the experiment is not known in advance, let us suppose that all possible outcomes are known.

The mathematical description of the random experiment is given in terms of:

• Sample space
• Events
• Probability

The set of all possible outcomes is called the sample space and it is given the symbol S. For example, in a coin-tossing experiment we cannot predict exactly whether “head” or “tail” will appear; but we know that all possible outcomes are “heads” or “tails,” abbreviated as H and T, respectively. Thus, the sample space for this random experiment is:

S = {H, T}. (1.1)

Each element in S is called a sample point, si. Each outcome is represented by a corresponding sample point. For example, the sample points in (1.1) are:

s1 = H, s2 = T. (1.2)

When rolling a die, the outcomes correspond to the numbers of dots (1–6). Consequently, the sample set in this experiment is:

S = {1, 2, 3, 4, 5, 6}. (1.3)
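As a quick illustration of sample spaces and events in code (the event "even number of dots" is our own example, not from the text):

```python
# Sample spaces for the two experiments.
S_coin = {"H", "T"}            # (1.1)
S_die = {1, 2, 3, 4, 5, 6}     # (1.3)

# An event is a subset of the sample space; "even number of dots" for the die:
even = {s for s in S_die if s % 2 == 0}
assert even <= S_die           # every event is contained in S
```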

G.J. Dolecek, Random Signals and Processes Primer with MATLAB, DOI 10.1007/978-1-4614-2386-7_1, © Springer Science+Business Media New York 2013



Stochastic processes // Definition


[Figure: a (discrete) random variable X maps outcomes s1, s2 to realizations x1 = 0, x2 = 1 of the r.v. X]

Stochastic processes // Definition

• Generation of a (Bernoulli) random variable:

X = (rand <= 0.5)   % completed from the truncated slide snippet; p = 0.5 assumed

• Time series or sample path:

X = (rand(1, N) <= 0.5)   % N draws in a row form a sample path; N assumed
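The slide's rand-based trick can be mirrored in Python (a sketch; the success probability p = 0.5 and the path length are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5  # assumed success probability

# One Bernoulli random variable: a single uniform draw thresholded at p.
x = int(rng.random() <= p)

# A time series / sample path: the same experiment repeated at each time step.
path = (rng.random(100) <= p).astype(int)
```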

Stochastic processes // Definition


[Figure: stochastic process X(t, S) — each outcome s1, s2, … of the experiment is assigned a time function Xi(t), Xj(t), …]

Stochastic process

Stochastic processes // Definition

6.1.1 Deterministic and Nondeterministic Random Processes

A random process is said to be deterministic if all future values of any realization can be determined from its past values [PEE93, p. 166]. As an example, consider sinusoidal realizations where the amplitude varies for different realizations, as shown in Fig. 6.2a. The future values of each realization are shown with dotted lines and are completely known. In the opposite case, shown in Fig. 6.2b, the process is nondeterministic. The waveforms of realizations are irregular and cannot be described using a mathematical expression. Consequently, future values (shown with dotted lines) cannot be precisely determined.

From the previous discussion, note that the term “random” in a random process is not related to the wave shape of its realizations, but to the uncertainty in which time function is assigned to a particular outcome. Similarly, as in a die-rolling experiment, one knows with certainty that a number from 1 to 6 will appear, and the uncertainty is which of those numbers will appear in each roll.

6.1.2 Continuous and Discrete Random Processes

The classification of random processes may be performed depending on whether amplitudes and time are continuous or discrete values. As both amplitude and time can be either continuous or discrete, all possible combinations are shown in Fig. 6.3.

Fig. 6.1 Mapping a space of events S to a (x, y) space

372 6 Random Processes

Stochastic process

• Stochastic process

Stochastic processes // Definition

354 CHAPTER 10 STOCHASTIC PROCESSES

[Figure: outcomes s1, s2, s3 in the sample space are mapped to sample functions x(t, s1), x(t, s2), x(t, s3)]

Figure 10.1 Conceptual representation of a random process.

Definition 10.1 Stochastic Process
A stochastic process X(t) consists of an experiment with a probability measure P[·] defined on a sample space S and a function that assigns a time function x(t, s) to each outcome s in the sample space of the experiment.

Essentially, the definition says that the outcomes of the experiment are all functions of time. Just as a random variable assigns a number to each outcome s in a sample space S, a stochastic process assigns a sample function to each outcome s.

Definition 10.2 Sample Function
A sample function x(t, s) is the time function associated with outcome s of an experiment.

A sample function corresponds to an outcome of a stochastic process experiment. It is one of the possible time functions that can result from the experiment. Figure 10.1 shows the correspondence between the sample space of an experiment and the ensemble of sample functions of a stochastic process. It also displays the two-dimensional notation for sample functions x(t, s). In this notation, X(t) is the name of the stochastic process, s indicates the particular outcome of the experiment, and t indicates the time dependence. Corresponding to the sample space of an experiment and to the range of a random variable, we define the ensemble of a stochastic process.

Definition 10.3 Ensemble
The ensemble of a stochastic process is the set of all possible time functions that can result from an experiment.
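Definition 10.1 can be sketched directly: an experiment outcome selects a whole time function. A minimal sketch (the sinusoid-with-random-amplitude process and its parameters are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 50)          # common time axis

def sample_function(s):
    """The time function x(t, s) assigned to outcome s (here, an amplitude)."""
    return s * np.sin(2 * np.pi * t)

# The experiment: each outcome s selects a whole time function.
outcomes = rng.uniform(0.5, 2.0, size=3)           # s1, s2, s3
ensemble = [sample_function(s) for s in outcomes]  # cf. Definition 10.3

# Fixing t = t1 collapses the process into an ordinary random variable.
i1 = 10
values_at_t1 = [x[i1] for x in ensemble]
```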


1

Stochastic processes

1.1 Introduction
The theme of this book is Bayesian Analysis of Stochastic Process Models. In this first chapter, we shall provide the basic concepts needed in defining and analyzing stochastic processes. In particular, we shall review what stochastic processes are, their most important characteristics, the important classes of processes that shall be analyzed in later chapters, and the main inference and decision-making tasks that we shall be facing. We also set up the basic notation that will be followed in the rest of the book. This treatment is necessarily brief, as we cover material which is well known from, for example, the texts that we provide in our final discussion.

1.2 Key concepts in stochastic processes
Stochastic processes model systems that evolve randomly in time, space or space-time. This evolution will be described through an index t ∈ T. Consider a random experiment with sample space Ω, endowed with a σ-algebra F and a base probability measure P. Associating numerical values with the elements of that space, we may define a family of random variables {Xt, t ∈ T}, which will be a stochastic process. This idea is formalized in our first definition, which covers our object of interest in this book.

Definition 1.1: A stochastic process {Xt, t ∈ T} is a collection of random variables Xt, indexed by a set T, taking values in a common measurable space S endowed with an appropriate σ-algebra.

T could be a set of times, when we have a temporal stochastic process; a set of spatial coordinates, when we have a spatial process; or a set of both time and spatial coordinates, when we deal with a spatio-temporal process. In this book, in general,

Bayesian Analysis of Stochastic Process Models, First Edition. David Rios Insua, Fabrizio Ruggeri and Michael P. Wiper. © 2012 John Wiley & Sons, Ltd. Published 2012 by John Wiley & Sons, Ltd.


The “output” of the experiment is an ensemble of functions over time

• Stochastic process

Stochastic processes // Definition


Random variable

Time series

• Stochastic processes:

Stochastic processes // Definition

358 CHAPTER 10 STOCHASTIC PROCESSES

[Figure: four panels over t ∈ [−1, 1] — Xcc(t): continuous-time, continuous-value; Xdc(t): discrete-time, continuous-value; Xcd(t): continuous-time, discrete-value; Xdd(t): discrete-time, discrete-value]

Figure 10.3 Sample functions of four kinds of stochastic processes. Xcc(t) is a continuous-time, continuous-value process. Xdc(t) is a discrete-time, continuous-value process obtained by sampling Xcc(t) every 0.1 seconds. Rounding Xcc(t) to the nearest integer yields Xcd(t), a continuous-time, discrete-value process. Lastly, Xdd(t), a discrete-time, discrete-value process, can be obtained either by sampling Xcd(t) or by rounding Xdc(t).

Definition 10.4 Discrete-Value and Continuous-Value Processes
X(t) is a discrete-value process if the set of all possible values of X(t) at all times t is a countable set SX; otherwise X(t) is a continuous-value process.

Definition 10.5 Discrete-Time and Continuous-Time Processes
The stochastic process X(t) is a discrete-time process if X(t) is defined only for a set of time instants, tn = nT, where T is a constant and n is an integer; otherwise X(t) is a continuous-time process.

In Figure 10.3, we see that the combinations of continuous/discrete time and continuous/discrete value result in four categories. For a discrete-time process, the sample function is completely described by the ordered sequence of random variables Xn = X(nT).

Definition 10.6 Random Sequence
A random sequence Xn is an ordered sequence of random variables X0, X1, . . ..

Quiz 10.2 For the temperature measurements of Example 10.3, construct examples of the measurement process such that the process is either (1) discrete-time, discrete-value, (2) discrete-time, continuous-value,
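The four categories of Figure 10.3 can be reproduced in a few lines (a sketch; the underlying continuous-value signal is an assumed interpolated Gaussian waveform, not the book's exact one):

```python
import numpy as np

rng = np.random.default_rng(2)

# An assumed smooth continuous-value signal stands in for Xcc(t).
t_dense = np.linspace(-1, 1, 2001)
knots = np.linspace(-1, 1, 21)
x_cc = np.interp(t_dense, knots, rng.normal(0.0, 1.0, 21))

# Discrete-time, continuous-value: sample every T = 0.1 seconds (tn = nT).
n = np.arange(-10, 11)
x_dc = np.interp(n * 0.1, t_dense, x_cc)

# Continuous-time, discrete-value: round to the nearest integer.
x_cd = np.round(x_cc)

# Discrete-time, discrete-value: round the sampled process.
x_dd = np.round(x_dc)
```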


6.2 Statistics of Random Processes

6.2.1 Description of a Process in One Point

We have already mentioned that a random process in a specified time t becomes a random variable. Let us consider the time instant t1. A random variable defined in a

Fig. 6.2 Deterministic and nondeterministic random processes

Fig. 6.3 Discrete and continuous random processes


Stochastic processes // Definition

• Description of the process at one point: fixing the time, we consider the random variable

time instant t1 is a random variable X1. The distribution function of the variable, denoted as FX1(x1; t1), is defined as:

FX1(x1; t1) = P{X1 ≤ x1; t1}. (6.1)

Note that the definition (6.1) is the same as the definition of the distribution of a random variable, with the only difference being that the distribution (6.1) depends on the time instant t1.

The meaning of the distribution (6.1) is explained in Fig. 6.4. For the specified time instant t1 and value x1, the distribution (6.1) presents the probability that in the instant t1 all realizations are less than or equal to a value x1. This probability can also be interpreted using a frequency ratio, as described below.

Consider that a random experiment is performed n times and a particular realization is given to each outcome, as shown in Fig. 6.4. Let n(x1, t1) be the total number of successes where the amplitudes of realizations in the time instant t1 are not more than x1.

Taking n to be very large, the desired probability can be approximated as:

FX1(x1; t1) ≈ n(x1, t1) / n. (6.2)

The distribution (6.2) is called a distribution of the first order, or a one-dimensional distribution of a process X(t). Note that the one-dimensional distribution is obtained by observing a process in one time instant.

Fig. 6.4 Description of a process in a particular time instant

374 6 Random Processes
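The frequency-ratio reading of (6.2) is easy to check numerically. A sketch with an assumed toy process X(t) = W·t, W ~ N(0, 1) (our choice, for illustration), estimating FX1 at one fixed instant across many realizations:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)
n_real = 5000

# Ensemble of realizations of the assumed process X(t) = W * t.
t = np.linspace(0.1, 2.0, 20)
W = rng.normal(0.0, 1.0, size=(n_real, 1))
X = W * t                       # row i is realization i, column j is instant t_j

# First-order distribution at one instant: F(x1; t1) ≈ n(x1, t1) / n, cf. (6.2).
i1, x1 = 5, 0.5
F_hat = np.mean(X[:, i1] <= x1)

# Exact value for this process: X(t1) ~ N(0, t1^2).
t1 = t[i1]
F_exact = 0.5 * (1.0 + erf(x1 / (t1 * sqrt(2.0))))
assert abs(F_hat - F_exact) < 0.03
```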


Distribution function

A density function is obtained as the derivative of the corresponding distribution function:

fX1(x1; t1) = ∂FX1(x1; t1) / ∂x1.

Stochastic processes // Definition

• Description of the process at two points:

Distribution function

Density function

6.2.3 Description of Process in n Points

The previous discussion can be generalized by observing a process in n points. Obviously, in this way we can obtain a more detailed description of a process. However, the resulting description will be highly complex.

By observing a process in n points, t1, . . ., tn, we get n random variables:

X1 = X(t1), X2 = X(t2), . . ., Xn = X(tn). (6.7)

We define a joint distribution function of nth order, as:

FX1X2...Xn(x1, x2, . . ., xn; t1, t2, . . ., tn) = P{X(t1) ≤ x1, X(t2) ≤ x2, . . ., X(tn) ≤ xn} = P{X1 ≤ x1, X2 ≤ x2, . . ., Xn ≤ xn; t1, t2, . . ., tn}. (6.8)

The corresponding joint density function of nth order is given as:

fX1X2...Xn(x1, x2, . . ., xn; t1, t2, . . ., tn) = ∂ⁿFX1X2...Xn(x1, x2, . . ., xn; t1, t2, . . ., tn) / (∂x1 ∂x2 . . . ∂xn).
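The nth-order distribution (6.8) is again a frequency ratio over the ensemble. A two-point sketch (the toy process X(t) = W·t with W ~ N(0, 1), and the instants and thresholds, are our own assumptions):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(4)
n = 10000

# Assumed toy process X(t) = W * t, observed at two instants t1, t2.
W = rng.normal(size=n)
t1, t2 = 0.5, 1.5
X1, X2 = W * t1, W * t2

# Second-order (two-point) distribution, cf. (6.8): P{X1 <= x1, X2 <= x2}.
x1, x2 = 0.3, 1.2
F2_hat = np.mean((X1 <= x1) & (X2 <= x2))

# For this (degenerate) process the joint event is just W <= min(x1/t1, x2/t2).
w_max = min(x1 / t1, x2 / t2)
F2_exact = 0.5 * (1.0 + erf(w_max / sqrt(2.0)))
assert abs(F2_hat - F2_exact) < 0.03
```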

Stochastic processes // Definition

• Caution: with some abuse of notation, but to keep things simple, in what follows we will denote the joint density (pdf) as in the discrete case

Density function


Stochastic processes // Dynamics

• Evolution of the process: to characterize the dynamics of the process, i.e., in the temporal case, the transition probabilities

• These are specified as conditional probabilities


where P(1, · · · , n) dxk+1 dxk+2 · · · dxn stands for the joint probability of finding that x has a certain value

xk+1 < x ≤ xk+1 + dxk+1 at time tk+1,
xk+2 < x ≤ xk+2 + dxk+2 at time tk+2,
· · ·

For instance, referring to Figure 13, we can calculate the joint probability P(1, 2) dx1 dx2 by following the vertical lines at t1 and t2 and finding the fraction of paths for which x(t1) = x1 within tolerance dx1 and x(t2) = x2 within tolerance dx2, respectively.³

Summing up, the joint probability density function, written explicitly as

P(x1, t1; x2, t2; · · · ; xn, tn),

is all we need to fully characterise the statistical properties of a stochastic process and to calculate the quantities of interest characterising the process (see Table 6).

The dynamics, or evolution, of a stochastic process can be represented through the specification of transition probabilities:

P(2 | 1): probability of finding 2, when 1 is given;
P(3 | 1, 2): probability of finding 3, when 1 and 2 are given;
P(4 | 1, 2, 3): probability of finding 4, when 1, 2 and 3 are given;
· · ·

Transition probabilities for a stochastic process are nothing but the conditional probabilities suitable to predict the future values of X(t) (i.e., xk+1, xk+2, · · · , xk+l, at tk+1, tk+2, · · · , tk+l), given the knowledge of the past (x1, x2, · · · , xk, at t1, t2, · · · , tk). The conditional pdf, explicitly defined in terms of the joint pdf, can be written:

P(xk+1, tk+1; · · · ; xk+l, tk+l | x1, t1; · · · ; xk, tk) = P(x1, t1; · · · ; xk+l, tk+l) / P(x1, t1; · · · ; xk, tk), (25)

where the variables to the left of the conditioning bar are the future and those to the right are the past, assuming the time ordering t1 < t2 < · · · < tk < tk+1 < · · · < tk+l.

By using transition probabilities and the product rule, the following update equations can be written:

P(1, 2) = P(2 | 1) P(1) (26)
P(1, 2, 3) = P(3 | 1, 2) P(1, 2)
P(1, 2, 3, 4) = P(4 | 1, 2, 3) P(1, 2, 3)
· · ·

The transition probabilities must satisfy the normalisation condition ∫ P(2 | 1) dx2 = 1. Since P(2) = ∫ P(1, 2) dx1 and by using the update eqs. (26), the following evolution (integral) equation holds

3 This gives an intuitive insight into the notion of P(1, 2) as a density.
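The update equations (26) and the normalisation condition can be checked on a discrete-state example, where integrals become sums (a sketch; the two-state chain and its numbers are invented for illustration):

```python
import numpy as np

# Hypothetical two-state chain: P1[i] = P(x1 = i), K[i, j] = P(x2 = j | x1 = i).
P1 = np.array([0.3, 0.7])
K = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Normalisation: the discrete analogue of the condition that P(2 | 1)
# integrates to 1, checked row by row.
assert np.allclose(K.sum(axis=1), 1.0)

# Product rule / update equation (26): P(1, 2) = P(2 | 1) P(1).
P12 = P1[:, None] * K

# Evolution: P(2) = sum over 1 of P(1, 2), the discrete analogue of the integral.
P2 = P12.sum(axis=0)   # → [0.41, 0.59]
```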

We have used Kerson Huang's abbreviated notation


Fig. 13 An ensemble of paths representing a stochastic process. Each path represents the sequence in time of raw x coordinates from different scanpaths recorded on the same picture (cf. Fig. 12). We can conceive the trajectories of such an ensemble as realisations of a stochastic process.

at times t1, t2, t3, · · ·. The set S whose elements are the values of the process is called the state space.

Thus, we can conceive the stochastic process X(t) as an ensemble of paths as shown in Figure 3 or, more simply, as illustrated in Figure 13: here, for concreteness, we show four series of only the raw x coordinates of different eye-tracked subjects gazing at the picture shown in Figure 3. Note that if we fix the time, e.g., t = t1, then X(t1) boils down to a RV (vertical values); the same holds if we choose one path x and we (horizontally) consider the set of values x1, x2, x3, · · · , at times t1, t2, t3, · · ·.

We use Huang's abbreviation [50]:

k ↔ {xk, tk}

To describe the process completely we need to know the correlations in time, that is, the hierarchy of pdfs (but see Table 6 for a discussion of correlation):

P(1): the 1-point pdf (17)
P(1, 2): the 2-point pdf
P(1, 2, 3): the 3-point pdf
· · ·

up to the n-point joint pdf. The n-point joint pdf must imply all the lower k-point pdfs, k < n:

P(1, · · · , k) = ∫ P(1, · · · , n) dxk+1 dxk+2 · · · dxn (18)

Table of content 23

where P(1, · · · ,n)dxk+1dxk+2 · · · dxn stands for the joint probability of finding thatx has a certain value

xk+1 < x ≤ xk+1 + dxk+1 at time tk+1xk+2 < x ≤ xk+2 + dxk+2 at time tk+2

· · ·For instance, referring to Figure 13, we can calculate the joint probability

P(1,2)dx1dx2 by following the vertical line at t1 and t2 and find the fraction ofpaths for which x(t1) = x1 within tolerance dx1 and x(t2) = x2 within tolerancedx2, respectively3

Summing up, the joint probability density function, written explicitly as

P(x1, t1; x2, t2; · · · ; xn , tn),

is all we need to fully characterise the statistical properties of a stochastic processand to calculate the quantities of interest characterising the process (see Table 6).

The dynamics, or evolution of a stochastic process can be represented throughthe specification of transition probabilities:

P(2 | 1) : probability of finding 2, when 1 is given;P(3 | 1,2) : probability of finding 3, when 1 and 2 are given;

P(4 | 1,2,3) : probability of finding 4, when 1,2 and 3 are given;· · ·

Transition probabilities for a stochastic process are nothing but the conditionalprobabilities suitable to predict the future values of X(t) (i.e., xk+1,xk+2, · · · xk+l ,at tk+1, tk+2, · · · tk+l ), given the knowledge of the past (x1,x2, · · · xk , at t1, t2, · · · tk ).The conditional pdf explicitly defined in terms of the joint pdf can be written:

P(future︷!!!!!!!!!!!!!!!!!!!!!!!!!!!!︸︸!!!!!!!!!!!!!!!!!!!!!!!!!!!!︷

xk+1, tk+1; · · · ; xk+l , tk+l | x1, t1; · · · ; xk , tk︸!!!!!!!!!!!!!!︷︷!!!!!!!!!!!!!!︸past

) =P(x1, t1; · · · ; xk+l , tk+l )

P(x1, t1; · · · ; xk , tk ).

(25)assuming the time ordering t1 < t2 < · · · < tk < tk+1 < · · · < tk+l .

By using transition probabilities and the product rule, the following update equations can be written:

P(1,2) = P(2 | 1) P(1)   (26)
P(1,2,3) = P(3 | 1,2) P(1,2)
P(1,2,3,4) = P(4 | 1,2,3) P(1,2,3)
· · ·

The transition probabilities must satisfy the normalisation condition ∫ P(2 | 1) dx_2 = 1. Since P(2) = ∫ P(1,2) dx_1 and by using the update eqs. (26), the following evolution (integral) equation holds:

P(2) = ∫ P(2 | 1) P(1) dx_1.

³ This gives an intuitive insight into the notion of P(1,2) as a density.
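A numerical sanity check of the normalisation condition and of the evolution equation P(2) = ∫ P(2|1) P(1) dx1: the sketch below assumes a Gaussian transition kernel discretised on a grid (all choices illustrative), for which propagating a Gaussian P(1) must yield a Gaussian P(2) with the variances added.

```python
import numpy as np

def gauss(x, mean, var):
    return np.exp(-(x - mean) ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

x = np.linspace(-8.0, 8.0, 1601)
dx = x[1] - x[0]

# Transition kernel: K[i, j] = P(x2 = x[j] | x1 = x[i]), Gaussian with var 0.5
K = gauss(x[None, :], x[:, None], 0.5)

# Normalisation ∫ P(2|1) dx2 = 1 (checked away from the grid edges, where
# the finite grid truncates the kernel)
interior = np.abs(x) < 4.0
assert np.allclose(K[interior].sum(axis=1) * dx, 1.0, atol=1e-6)

# Evolution equation: P(2) = ∫ P(2|1) P(1) dx1
p1 = gauss(x, 0.0, 1.0)
p2 = (K * p1[:, None]).sum(axis=0) * dx
assert np.allclose(p2, gauss(x, 0.0, 1.5), atol=1e-6)  # variances add: 1 + 0.5
```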


Stochastic processes // Summary description of a process

• Descriptive functions: they do not fully characterise the process, but provide a concise summary of it

• Given the process, we define the functions

• mean

JWST172-c01 JWST172-Ruggeri, March 3, 2012

4 BAYESIAN ANALYSIS OF STOCHASTIC PROCESS MODELS

We shall focus on stochastic processes indexed by time, and will call T the space of times. When T is discrete, we shall say that the process is in discrete time and will denote time through n and represent the process through {X_n, n = 0, 1, 2, . . .}. When T is continuous, we shall say that the process is in continuous time. We shall usually assume that T = [0, ∞) in this case. The values adopted by the process will be called the states of the process and will belong to the state space S. Again, S may be either discrete or continuous.

At least two visions of a stochastic process can be given. First, for each ω ∈ Ω, we may write X_t(ω) = g_ω(t) and we have a function of t which is a realization, or sample function, of the stochastic process and describes a possible evolution of the process through time. Second, for any given t, X_t is a random variable. To completely describe the stochastic process, we need a joint description of the family of random variables {X_t, t ∈ T}, not just the individual random variables. To do this, we may provide a description based on the joint distribution of the random variables at any discrete subset of times, that is, for any {t_1, . . . , t_n} with t_1 < · · · < t_n, and for any {x_1, . . . , x_n}, we provide

P(X_{t_1} ≤ x_1, . . . , X_{t_n} ≤ x_n).

Appropriate consistency conditions over these finite-dimensional families of distributions will ensure the definition of the stochastic process, via the Kolmogorov extension theorem, as in, for example, Øksendal (2003).

Theorem 1.1: Let T ⊆ [0, ∞). Suppose that, for any {t_1, . . . , t_n} with t_1 < · · · < t_n, the random variables X_{t_1}, . . . , X_{t_n} satisfy the following consistency conditions:

1. For all permutations π of 1, . . . , n and x_1, . . . , x_n we have that P(X_{t_1} ≤ x_1, . . . , X_{t_n} ≤ x_n) = P(X_{t_{π(1)}} ≤ x_{π(1)}, . . . , X_{t_{π(n)}} ≤ x_{π(n)}).

2. For all x_1, . . . , x_n and t_{n+1}, . . . , t_{n+m}, we have P(X_{t_1} ≤ x_1, . . . , X_{t_n} ≤ x_n) = P(X_{t_1} ≤ x_1, . . . , X_{t_n} ≤ x_n, X_{t_{n+1}} < ∞, . . . , X_{t_{n+m}} < ∞).

Then, there exists a probability space (Ω, F, P) and a stochastic process X_t : T × Ω → R^n having the families X_{t_1}, . . . , X_{t_n} as finite-dimensional distributions.

Clearly, the simplest case will hold when these random variables are independent, but this is the territory of standard inference and decision analysis. Stochastic processes adopt their special characteristics when these variables are dependent.

Much as with moments for standard distributions, we shall use some tools to summarize a stochastic process. The most relevant are, assuming all the involved moments exist:

Definition 1.2: For a given stochastic process {X_t, t ∈ T} the mean function is

µ_X(t) = E[X_t].
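A minimal sketch of Definition 1.2 (the process and all parameters below are assumptions for illustration): simulate many sample functions of a random walk with drift and estimate µ_X(t) = E[X_t] by averaging across realizations at each fixed t.

```python
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_steps, dt, drift = 20_000, 100, 0.02, 1.5

# Each row is one sample function g_omega(t); each column is the random
# variable X_t at a fixed time t.
increments = drift * dt + rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)

t = dt * np.arange(1, n_steps + 1)
mu_hat = paths.mean(axis=0)               # Monte Carlo estimate of mu_X(t)

assert np.max(np.abs(mu_hat - drift * t)) < 0.05   # true mean function: drift*t
```

The two "visions" of the process show up directly in the array: rows are realizations g_ω(t); columns are the random variables X_t whose expectations trace out the mean function.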


Stochastic processes // Summary description of a process

• Descriptive functions: mean at time t1


In many practical situations, it is necessary to consider only first- or second-order stationary processes. This can be further simplified by introducing the notion of a wide-sense stationary process, considering a single process, or jointly wide-sense stationary processes. Wide-sense and jointly wide-sense stationary processes are defined in Sect. 6.5.

6.4 Mean Value

A mean value, considered in Chap. 2 for a single variable, can also be introduced for a process observed at a particular time instant, as shown in Fig. 6.6. The process at a time instant t_1 is a random variable X_1(t_1) = X_1. Let the continuous range of the variable X_1 be divided into k elements Δx, so small that if the variable is in the interval Δx we say that it is equal to Δx. Similarly, if the variable is in the ith interval iΔx we say that it is equal to iΔx.

[Fig. 6.6: Explanation of the mean value of a process]

As in the case of a single variable, we first find an empirical mean value, as the result of a performed experiment. The experiment is repeated N times under the same conditions. The process X(t) is obtained by assigning N realizations x(t) to the experiment's outcomes. Consider all realizations at the time instant t_1, and suppose that we know the following:

N_1 realizations are in the interval [0, Δx] (like realization x_{n+1}(t) in Fig. 6.6);
N_2 realizations are in the interval [Δx, 2Δx] (like realization x_{n−1}(t) in Fig. 6.6);
. . .
N_i realizations are in the interval [(i−1)Δx, iΔx] (like realization x_n(t) in Fig. 6.6);
. . .
N_k realizations are in the last interval [(k−1)Δx, kΔx].

The empirical mean value of the process at the time instant t_1 is equal to:

X_emp.av(t_1) = (N_1 Δx + N_2 2Δx + · · · + N_i iΔx + · · · + N_k kΔx) / N = Σ_{i=1}^{k} (N_i/N) iΔx = Σ_{i=1}^{k} [(N_i/N)(1/Δx)] iΔx Δx.   (6.21)

Because N is very large and Δx very small, the empirical mean value becomes independent of the experiment and approaches the mean value of the process:

X_emp.av → X̄(t_1) = m(t_1) = ∫_{−∞}^{∞} x f_X(x; t_1) dx, as N → ∞, Δx → dx, iΔx → x.   (6.22)

At another time instant t_2, we may obtain a different mean value:

m(t_2) = ∫_{−∞}^{∞} x f_X(x; t_2) dx.   (6.23)

In general, the mean value of the process depends on time:

m(t) = X̄(t) = E{X(t)} = ∫_{−∞}^{∞} x f_X(x; t) dx.   (6.24)
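Eq. (6.21) can be reproduced directly. The sketch below (a hypothetical ensemble: the distribution, bin width, and sample size are all assumptions) draws N realizations of X(t1), bins them with width Δx, takes the value iΔx for the ith interval exactly as in the text, and checks that the binned mean approaches m(t1) of eq. (6.22) up to the O(Δx) bias of that convention.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 500_000
x_t1 = rng.normal(2.0, 1.0, size=N)   # N realizations of X(t1) ~ N(2, 1)

dx = 0.05
edges = np.linspace(-3.0, 7.0, 201)   # 200 bins of width dx covering the data
counts, _ = np.histogram(x_t1, bins=edges)

# Eq. (6.21): a value in the ith interval is counted as i*dx, i.e. the right
# edge of its bin, so the estimate carries a small O(dx) upward bias.
reps = edges[1:]
emp_mean = np.sum(counts / N * reps)

assert abs(emp_mean - 2.0) < dx       # m(t1) = ∫ x f_X(x; t1) dx = 2
```

Repeating the same computation at a different time instant (a different column of the ensemble) would in general give a different value, which is exactly why m(t) in eq. (6.24) is a function of time.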




Stochastic processes // Summary description of a process

• Descriptive functions: in general, the mean at time t is


Definition 1.2: For a given stochastic process {X_t, t ∈ T} the mean function is µ_X(t) = E[X_t].


Mean, or expected value, of the process at time t
