A framework for QoI-inspired analysis for sensor network deployment planning

Sadaf Zahedi, EE Department, UCLA
Chatschik Bisdikian, T. J. Watson Research Center, IBM US

International Technology Alliance in Network & Information Sciences


Page 1:

A framework for QoI-inspired analysis for sensor network deployment planning

Sadaf Zahedi, EE Department, UCLA
Chatschik Bisdikian, T. J. Watson Research Center, IBM US

Page 2: Problem Statement

[Figure: an event with signature S*(t) occurs in the observation field; sensors S1…S5, deployed at distances d1…d5 in the sensor deployment field, collect data about it, and the designer asks: is the QoI of the sensor-collected data good enough?]

Goal: evaluate, and ultimately optimize, the quality of information (QoI) of the sensor networks that support sensor-based applications.

QoI definition: QoI is the collective effect of the (accessible) knowledge, derived from the sensor-collected data, that determines the degree of accuracy.

Event detection is common to most sensor-based applications, such as surveillance and intelligence gathering, and detecting the presence of enemy weaponry or hostile activities (e.g., gunfire, explosions).

QoI attributes of importance for the event-detection class of applications:

• Detection probability (Pd): the probability of correctly detecting the occurrence of an event
• False-alarm probability (Pf): the probability of declaring an occurrence when no event occurred
• Error probability (Pe): the probability of making any kind of error in the decision

Page 3: Reference Detection System

[Figure: the event signature, signal s*(t), undergoes signal propagation and is corrupted by noise n(t); each of the M sensor subsystems (samplers) observes its projection s_k(t), sampling it at times {t_k1, …, t_kN} ⊂ W_T to produce measurements {s_k1, …, s_kN}; the samples {r_1, …, r_N} reach L fusion subsystems over communication links, each of which decides whether the event E occurred. An inset plot of s(t) versus time t shows the signal and its measurements, captioned "Samples of the 'projection' of the event signal s*(t)".]

Anchored on the core fusion and detection analysis engine, a system-level analysis framework can be developed that contains the required system parameters and provides the knowledge of the signal projections at the sensor locations.

Page 4: QoI Analysis Framework/Toolkit Architecture

[Figure: the core QoI analysis engine sits between an input pre-processing stage (the signal(s) s*(t), propagation/attenuation model(s), noise model(s), measurement-error model(s), sampling policy(ies), and topology and cost constraints) and an output post-processing stage (integration, e.g., averaging; optimization, e.g., selecting the best deployment plan; detection test tools).]

• Planner: provides the deployment topology, QoI objectives, cost constraints, application domains, etc.

• Designer: provides sampling policies and system models (libraries)

• Output: a deployment plan, answers to "Is the QoI good enough?", what-if scenarios, …

Page 5: Core QoI Analysis Engine

• Binary hypothesis testing:

– Hypothesis H1: r_i = s_i + n_i, i = 1, 2, …, N (event occurred)
– Hypothesis H0: r_i = n_i, i = 1, 2, …, N (no event)

• The likelihood ratio test (LRT):

f_{R^N|H1}(r^N) / f_{R^N|H0}(r^N) ≷ η  (decide H1 if larger, H0 otherwise)

– f_{R^N|Hi}(r^N) represents the pdf of the received samples conditioned on Hi
– η = P0/P1 is the Bayesian threshold

• Decision test:

l(r) = r^T C^{-1} S ≷ ln(η) + (1/2) S^T C^{-1} S  (decide H1 if larger, H0 otherwise)

– C is the noise covariance matrix, C = E{n^T n}, with n = [n1, n2, …, nN]

• Signal-to-noise ratio (SNR), sensor level: ς_k² = S_k^T C_k^{-1} S_k

• QoI performance metrics (sensor level):

P_d^k = Pr(decide H1 | H1) = 1 − Φ(ln(η)/ς_k − ς_k/2)
P_f^k = Pr(decide H1 | H0) = 1 − Φ(ln(η)/ς_k + ς_k/2)
P_e^k = P_f^k · P0 + (1 − P_d^k) · P1

• SNR, system level (at the central decision maker): ς_sys² = Σ_{k=1}^{M} S_k^T C_k^{-1} S_k

• QoI performance metrics (system level):

P_d^sys = 1 − Φ(ln(η)/ς_sys − ς_sys/2)
P_f^sys = 1 − Φ(ln(η)/ς_sys + ς_sys/2)
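The sensor-level metrics above can be evaluated numerically with only the standard library, expressing Φ through the error function. A minimal sketch (function names are illustrative, not part of the original toolkit); the `snr` argument is ς_k, i.e., the square root of S_k^T C_k^{-1} S_k:

```python
import math


def phi(x):
    """Standard normal CDF Φ(x)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))


def sensor_qoi(snr, eta, p0, p1):
    """Sensor-level QoI metrics for SNR ς = snr and Bayesian threshold η = p0/p1.

    Returns (Pd, Pf, Pe) per the Gaussian LRT expressions on this slide.
    """
    pd = 1.0 - phi(math.log(eta) / snr - snr / 2.0)  # detection probability
    pf = 1.0 - phi(math.log(eta) / snr + snr / 2.0)  # false-alarm probability
    pe = pf * p0 + (1.0 - pd) * p1                   # Bayes error probability
    return pd, pf, pe
```

For example, with ς_k = 2 and η = 1 (equal priors) this gives Pd = 1 − Φ(−1) ≈ 0.84 and Pf = 1 − Φ(1) ≈ 0.16; the same function yields the system-level metrics when called with ς_sys.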

Page 6: Fully Distributed Detection (L = M) vs. Centralized (L = 1)

[Figure: in the centralized architecture (L = 1), all M sensor subsystems forward their samples {S_k1, …, S_kN} as received measurements {R_k1, …, R_kN} over communication links to a single fusion subsystem, which makes the decision "E?"; in the fully distributed architecture (L = M), each sensor subsystem k has its own fusion subsystem that makes local decision k, and the M local decisions are combined according to the detection policy Q.]

Centralized detection (L = 1): the fusion subsystem applies the LRT to all samples, with

ς_sys² = Σ_{k=1}^{M} S_k^T C_k^{-1} S_k
P_d^sys = 1 − Φ(ln(η)/ς_sys − ς_sys/2)
P_f^sys = 1 − Φ(ln(η)/ς_sys + ς_sys/2)

Fully distributed detection (L = M): each sensor k makes a local decision with

P_d^k = 1 − Φ(ln(η)/ς_k − ς_k/2)
P_f^k = 1 − Φ(ln(η)/ς_k + ς_k/2)

and the system declares the event when at least Q of the M local decisions declare it (Q-out-of-M detection policy):

P_d^sys(Q; M) = Pr(q ≥ Q | H1) = Σ_{q=Q}^{M} Σ_{X_q ⊆ S_sensors, |X_q| = q} [ Π_{k ∈ X_q} P_d^k · Π_{k ∉ X_q} (1 − P_d^k) ]

The local QoI metrics feed the system-level QoI metrics through an iterative calculation of the QoI parameters.
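The Q-out-of-M fusion probability above can be evaluated by brute force over sensor subsets, which is practical for small M. A minimal sketch (the function name is illustrative), assuming independent local decisions across sensors:

```python
from itertools import combinations


def q_out_of_m(p_local, Q):
    """Probability that at least Q of the M independent local detectors
    declare the event, for heterogeneous per-sensor probabilities p_local[k].

    Passing the local Pd values yields the system-level Pd (conditioning on H1);
    passing the local Pf values yields the system-level Pf (conditioning on H0).
    """
    M = len(p_local)
    total = 0.0
    for q in range(Q, M + 1):                  # exactly q sensors declare, q = Q..M
        for X in combinations(range(M), q):    # every subset X_q of size q
            term = 1.0
            for k in range(M):
                term *= p_local[k] if k in X else 1.0 - p_local[k]
            total += term
    return total
```

For example, three identical sensors with local Pd = 0.9 under a 2-out-of-3 policy give a system-level Pd of 3·(0.81·0.1) + 0.729 = 0.972.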

Page 7: Performance Comparison

[Figure: detection-performance curves for Bayesian thresholds η = 2, η = 1, and η = 0.5, for an event observed by four sensors S1…S4 at distances d1…d4.]

Model assumptions used in the comparison:

• Sampling policy: same number of samples from each sensor; same sampling rate
• Signal signature: S*(t) = 1 − t²
• Attenuation model: a_k = 1/(1 + d_k²)
• Delay model: τ_k = d_k / v_k
• Propagation model: S_k(t) = a_k S*(t − τ_k)
• Bayesian threshold: η = P0/P1
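The projection each sensor observes follows directly from composing the attenuation, delay, and propagation models in the list above. A minimal sketch (the function name is illustrative):

```python
def projected_signal(t, d_k, v_k):
    """Projection S_k(t) of the event signature S*(t) = 1 - t^2 at a sensor
    a distance d_k from the event, with propagation speed v_k."""
    a_k = 1.0 / (1.0 + d_k ** 2)      # attenuation model: a_k = 1/(1 + d_k^2)
    tau_k = d_k / v_k                 # delay model: τ_k = d_k / v_k
    s_star = 1.0 - (t - tau_k) ** 2   # signature S*(.), delayed by τ_k
    return a_k * s_star               # propagation model: S_k(t) = a_k S*(t - τ_k)
```

A sensor at the event location (d_k = 0) sees the undistorted signature, while a sensor at d_k = 1 sees it attenuated to half amplitude and delayed by 1/v_k.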

Page 8: Conclusion

• A QoI-based framework for analyzing rather non-homogeneous systems

– Handles a finite number of sensors, transient signals, arbitrary sensor deployments, and a different noise level at each sensor

• The framework facilitates

– Decoupling the analysis into three steps (input pre-processing, core QoI analysis, output post-processing)
– Mixing and matching different analysis and modeling approaches

• Compared the centralized and distributed detection architectures with respect to QoI

• Showed the influence of a priori knowledge on the selection of the best detection policy for distributed schemes

Future work

• Deployment algorithms that optimize both QoI and cost subject to constraints

• Extension of the noise models to models with spatio-temporal correlation

• Consideration of measurement-error models (e.g., errors from faulty sensors, …)