
Distributed Multi-Scale Data Processing for Sensor Networks

Raymond S. Wagner

Ph.D. Thesis Defense, April 9, 2007

Collaborators

Marco Duarte

Véronique Delouille

Richard Baraniuk

David B. Johnson

J. Ryan Stinnett

T.S. Eugene Ng

Albert Cohen

Shu Du

Shriram Sarvotham

Sensor Network Overview

Collections of small battery-powered devices, called sensor nodes, that can:

• sense data

• process data

• share data

Nodes form ad-hoc networks to exchange data:

Data Collection Problem

PROBLEM: centralized collection very costly (power, bandwidth), especially near sink [figure: network bottleneck region].

Distributed Processing Solution

SOLUTION: nodes locally exchange data with neighbors, finding answers to questions in-network.

Distributed Data Representations

[chart: candidate approaches arranged by three criteria – little/no collaboration, wide meas. field support, quick decode]

• Dist. Source Coding

• Dist. Compressed Sensing

• Dist. Multi-Scale Analysis

• Dist. Regression

Novel Contributions

• development of multi-scale transform

• survey of application communication requirements

• analysis of numerical stability

• development of protocols

• analysis of energy cost

• development of API

(grouped on the slide under two headings: new algorithms; support for algorithms)

Multi-scale Wavelet Analysis

Unconditional basis for wide range of signal classes – good choice for sparse representation when little is known about signal.

[diagram: signal → WT → coefficients]

Multi-Resolution Analysis (MRA)

[nested approximation spaces $V_{j+1} \supset V_j \supset V_{j-1}$]

Fix $V_0$ with scaling function basis set $\{\varphi_{0,k}\}_k$, with $V_j = \mathrm{span}\{\varphi_{j,k}\}_k$ and $\varphi_{j,k}(t) = 2^{j/2}\,\varphi(2^j t - k)$

Project onto $\varphi_{j,k}$ to find scaling coefficient $s_{j,k} = \langle f, \varphi_{j,k} \rangle$, or find from previous-scale SCs as $s_{j,k} = \sum_n h_{n-2k}\, s_{j+1,n}$

Wavelet Space Analysis

Define difference spaces $W_j$ s.t. $V_{j+1} = V_j \oplus W_j$

[nested spaces: $V_{j+1} = V_j \oplus W_j$, $V_j = V_{j-1} \oplus W_{j-1}$]

Give $W_0$ wavelet function basis set $\{\psi_{0,k}\}_k$, with $W_j = \mathrm{span}\{\psi_{j,k}\}_k$

Project onto $\psi_{j,k}$ to find wavelet coefficient $d_{j,k} = \langle f, \psi_{j,k} \rangle$, or find from previous-scale SCs as $d_{j,k} = \sum_n g_{n-2k}\, s_{j+1,n}$

Wavelet Analysis for Sensor Networks

MRA assumes regular sample point spacing, power-of-two sample size. Not likely in sensor networks.

Wavelet lifting: in-place formulation of WT – distributable, tolerates irregular sampling grids [Sweldens, 1995]

Starts with all nodes $x_k$, scalar meas. $s_{J,k}$

At each scale $j = \{J-1, \ldots, j_0\}$, transform $\{s_{j+1,k}\}$ into:

• wavelet coefficients $\{d_{j,k}\}$

• scaling coefficients $\{s_{j,k}\}$

Iterate on SCs to $j = j_0$ so that meas. replaced by $\{s_{j_0,k}\}$ and $\{d_{j,k}\}_{j=j_0}^{J-1}$

Lifting Stages

Each transform scale decomposes into three stages: split, predict, update… (a minimal sketch in code follows)

[block diagram: split → P → U, cascaded across scales]

• SPLIT scale-$(j+1)$ nodes into "even" set $\mathcal{E}_j$ (retain SCs) and "odd" set $\mathcal{O}_j$ (carry WCs)

• PREDICT wavelet coeffs. $d_{j,k}$ at odd nodes from neighboring even-node SCs

• UPDATE scaling coeffs. $s_{j,k}$ at even nodes using the new WCs

GOAL: design split, P, U to distribute easily, tolerate grid irregularity, provide sparse representation
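The stage structure can be made concrete in a few lines. Below is a minimal 1-D sketch of one lifting scale under assumed conventions, not the thesis code: a lazy split where alternate samples become odds, linear prediction from the two flanking even nodes, and update weights chosen to preserve the trapezoid-rule integral of the SCs across scales.

```python
import numpy as np

def cell_lengths(x):
    """Trapezoid-rule cell length for each sample on a sorted 1-D grid."""
    I = np.empty(len(x))
    I[1:-1] = (x[2:] - x[:-2]) / 2
    I[0], I[-1] = (x[1] - x[0]) / 2, (x[-1] - x[-2]) / 2
    return I

def lifting_scale(x, s):
    """One lifting scale on an irregular sorted 1-D grid x with SCs s.
    SPLIT: odd-indexed interior samples become wavelet nodes.
    PREDICT: linear interpolation from the two flanking even nodes.
    UPDATE: weights chosen so sum(I * s) is preserved across scales."""
    odd = np.arange(1, len(x) - 1, 2)
    even = np.setdiff1d(np.arange(len(x)), odd)
    I = cell_lengths(x)
    xe, se = x[even], s[even].copy()
    Ie = cell_lengths(xe)                  # coarse-grid cell lengths
    d = np.empty(len(odd))
    for i, k in enumerate(odd):
        wl = (x[k + 1] - x[k]) / (x[k + 1] - x[k - 1])   # predict weights
        d[i] = s[k] - (wl * s[k - 1] + (1 - wl) * s[k + 1])
        l = np.searchsorted(even, k - 1)   # flanking evens on coarse grid
        r = np.searchsorted(even, k + 1)
        se[l] += I[k] * wl / Ie[l] * d[i]          # update: fold detail back
        se[r] += I[k] * (1 - wl) / Ie[r] * d[i]
    return xe, se, d

# one scale on an irregular grid; the weighted average is preserved exactly
x = np.sort(np.random.rand(33))
s = np.sin(2 * np.pi * x)
xe, se, d = lifting_scale(x, s)
assert np.isclose((cell_lengths(x) * s).sum(), (cell_lengths(xe) * se).sum())
```

Iterating lifting_scale on (xe, se) reproduces the cascade in the diagram: each pass roughly halves the node count and emits one batch of WCs.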

Split Design

[grids at scale j+1, scale j, scale j-1]

Goal: mimic regular grid split, s.t. $|\mathcal{E}_j| \approx |\mathcal{O}_j| \approx \tfrac{1}{2}\,|\mathcal{E}_{j+1}|$

Approach (sketched in code below):

1. Pick an $x_k \in \mathcal{E}_{j+1}$, put it in $\mathcal{E}_j$ (even)

2. Put all $x_n \in \mathcal{E}_{j+1}$ with $\|x_n - x_k\| \le r_j$ into $\mathcal{O}_j$ (odd)

3. Repeat until all elements of $\mathcal{E}_{j+1}$ visited
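A sketch of this greedy split, assuming Euclidean node coordinates and a per-scale separation radius r (both illustrative choices):

```python
import numpy as np

def greedy_split(coords, r):
    """Greedy radius-based split: visit unassigned nodes, keep one as 'even'
    (retains a scaling coefficient), and push all unassigned nodes within
    distance r into the 'odd' set (they will carry wavelet coefficients)."""
    unvisited = set(range(len(coords)))
    evens, odds = [], []
    while unvisited:
        k = unvisited.pop()                   # pick any unvisited node
        evens.append(k)
        near = [i for i in unvisited
                if np.linalg.norm(coords[i] - coords[k]) <= r]
        odds.extend(near)
        unvisited -= set(near)
    return np.array(evens), np.array(odds)

# example: one split of 250 random nodes in the unit square
coords = np.random.rand(250, 2)
evens, odds = greedy_split(coords, r=0.08)
print(len(evens), "evens,", len(odds), "odds")
```

The evens end up roughly r-separated, mimicking the even/odd decimation of a regular grid.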

Split Example

[panels: original grid; scale-5, scale-4, scale-3, scale-2, scale-1 grids]

Predict Design

Goal: encode WC at each odd node $x_k \in \mathcal{O}_j$ as difference from summary of local neighborhood behavior

Approach: fit order-$m$ polynomial to scale-$(j+1)$ SCs at neighboring even nodes $x_n \in \mathcal{N}_{j,k}$, evaluate at $x_k$: $\hat{s}_{j+1,k} = \sum_{x_n \in \mathcal{N}_{j,k}} p_{k,n}\, s_{j+1,n}$

WC for $x_k$ is difference between scale-$(j+1)$ SC and estimate: $d_{j,k} = s_{j+1,k} - \hat{s}_{j+1,k}$

Weights $p_{k,n}$ depend only on $m$, $d$ (dim. of $x$)

Given predict order $m$, must only specify $\mathcal{N}_{j,k}$ to find weights:

1. Consider points $x_n \in \mathcal{E}_j$ nearest to $x_k$

2. Pick $\mathcal{N}_{j,k}$ as smallest (cost) subset s.t. the order-$m$ fit is well-posed

3. If can't satisfy, reduce $m$ to $m-1$, repeat Step 1

Use min-norm solution [Jansen et al., 2001] (sketched below)
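The weight computation can be sketched as follows: build the monomial design matrix for the neighbors and evaluate the min-norm order-m fit at x_k via the pseudoinverse. The min-norm choice follows the Jansen et al. [2001] reference above; the monomial basis and the example neighborhood are illustrative.

```python
import numpy as np
from itertools import combinations_with_replacement

def monomials(points, m):
    """Design matrix of all monomials up to total degree m in d variables."""
    points = np.atleast_2d(points)
    cols = [np.ones(len(points))]
    for deg in range(1, m + 1):
        for idx in combinations_with_replacement(range(points.shape[1]), deg):
            cols.append(np.prod(points[:, idx], axis=1))
    return np.column_stack(cols)

def predict_weights(x_k, neighbors, m):
    """Weights p s.t. p @ s_neighbors evaluates the min-norm order-m
    polynomial fit of the neighbor SCs at location x_k."""
    A = monomials(neighbors, m)          # one row per neighbor
    v = monomials(x_k, m)[0]             # monomials evaluated at x_k
    return v @ np.linalg.pinv(A)         # pinv gives the min-norm solution

# example: planar (m=1) prediction from three neighbors in 2-D
neighbors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
p = predict_weights(np.array([[0.25, 0.25]]), neighbors, m=1)
s = np.array([1.0, 2.0, 3.0])            # SCs at the neighbors
print(p, p @ s)                          # exact for any affine field
```

Because the weights depend only on node locations, m, and d, they can be computed once per node and reused for every field realization.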

Update Design

Goal: enhance transform stability by preserving average value encoded by SCs across scales

Approach: choose update weights $u_{n,k}$ so that SCs, weighted by integrals $I_{j,n}$ of the constant scaling functions, preserve the average: $s_{j,n} = s_{j+1,n} + \sum_{x_k \in \mathcal{O}_j} u_{n,k}\, d_{j,k}$, with $\sum_n I_{j,n}\, s_{j,n} = \sum_n I_{j+1,n}\, s_{j+1,n}$

Transform Network Traffic

[diagram: per-scale predict and update message exchanges between odd and even nodes]

Coefficient Decay

A function $f$ is Hölder-$\alpha$ ($C^\alpha$) at point $x$ if $\exists$ a polynomial $P$ of degree $\lfloor \alpha \rfloor$ and some $C > 0$ such that $|f(y) - P(y)| \le C\,|y - x|^{\alpha}$

We show that, if $f$ is $C^\alpha$ at $x_k$, then $|d_{j,k}|$ is bounded by a constant times the $\alpha$-th power of the scale-$j$ prediction neighborhood radius, where the constant depends only on constants $C$, $m$, $d$

WT Application: Distributed Compression

IDEA: compress measurements by only allowing sensors with large-magnitude WCs to transmit to the sink.
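The effect is easy to see with an off-the-shelf centralized transform standing in for the distributed WT. In the sketch below, PyWavelets, the db2 wavelet, and the synthetic field are all illustrative choices: keep only the K largest-magnitude coefficients and measure the reconstruction error.

```python
import numpy as np
import pywt  # PyWavelets: centralized stand-in for the distributed WT

# smooth field sampled at N points; keep the K largest coefficients
N, K = 256, 20
t = np.linspace(0, 1, N)
field = np.exp(-4 * (t - 0.4) ** 2) + 0.3 * np.sin(6 * np.pi * t)

coeffs = pywt.wavedec(field, 'db2')               # multi-scale transform
flat = np.concatenate([np.abs(c) for c in coeffs])
thresh = np.sort(flat)[-K]                        # K-th largest magnitude
kept = [np.where(np.abs(c) >= thresh, c, 0.0) for c in coeffs]

recon = pywt.waverec(kept, 'db2')[:N]
print("MSE with", K, "coefficients:", np.mean((field - recon) ** 2))
```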

Compression Evaluation

Sample field classes:

Piecewise smooth across discontinuity

Globally smooth

Compressing Smooth Fields

[plot: average MSE vs. number of coefficients; legend: P only, P+U; 250 nodes, 100 trials]

Compressing Piecewise-Smooth Fields

[plot: average MSE vs. number of coefficients; legend: P only, P+U; 250 nodes, 100 trials]

Energy vs. Distortion (Smooth Field)

[plot: MSE vs. bottleneck energy (Joules), 1000 nodes; annotations mark the energy to compute the WT, the energy to dump all measurements to the sink, and the beneficial operating regime between them]

Energy vs. Distortion (Piecewise-smooth Field)

[plot: MSE vs. bottleneck energy (Joules), 1000 nodes]

WT Application: Distributed De-noising

1. in-network de-noising (requires inverse dist. WT)

2. compression with de-noising (guides threshold choice)

[plot: PSNR vs. coefficient count; annotation: noise dominates at high coefficient counts]

Implementation Lessons

Implemented WT in Duncan Hall sensor network

Need to support common patterns with abstraction to ease algorithm prototyping

Surveyed IPSN 2003-2006

Distilled common comm. patterns into network application programming interface (API) calls

Address-Based Sending

Send to single address – source node sends message to single destination, drawn from node ID space

Send to list of addresses – source node sends message to multiple destinations, drawn from node ID space

Send to multicast address – source node sends message to single group address, drawn from multicast address space

Address-Based API Calls

sendSingle (data, address, effort, hopLimit)

sendList (data, addList, effort, hopLimit)

sendMulti (data, address, effort, hopLimit)

• all calls provide packet fragmentation

• address / addList drawn from node-ID address space; sendMulti address drawn from multicast group address space

• effort – energy-based transmission effort abstraction (per-packet basis)

• hopLimit – limit on number of forwarding hops to destination

(usage sketch below)
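A hypothetical Python rendering of these calls; the real API lives at the network layer, so the stubs below only log what a node would transmit.

```python
# Hypothetical stubs mirroring the address-based calls above; they only
# print their intent, standing in for the network-layer implementation.
def sendSingle(data, address, effort, hopLimit):
    print(f"to node {address} (effort={effort}, hopLimit={hopLimit}): {data!r}")

def sendList(data, addList, effort, hopLimit):
    for address in addList:              # fan-out over node-ID destinations
        sendSingle(data, address, effort, hopLimit)

def sendMulti(data, address, effort, hopLimit):
    print(f"to group {address} (effort={effort}, hopLimit={hopLimit}): {data!r}")

# e.g., an odd node sending its scale-j WC to its predict neighbors
sendList(data={"scale": 3, "wc": 0.82}, addList=[17, 42, 63],
         effort=2, hopLimit=4)
```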

Region-Based Sending

Send to hop radius – source node sends message to all nodes within specified number of radio hops

Send to geographic radius – source node sends message to all nodes within specified geographic distance from its location

Send to circle – source node sends message to nodes (single or many) within a specified geographic distance of specified center

Send to polygon – source node sends message to nodes (single or many) within convex hull of specified list of vertex locations

Region-Based API Calls

sendHopRad (data, hopRad, effort, hopLimit)

sendGeoRad (data, geoRad, outHops, effort, hopLimit)

sendCircle (data, centerX, centerY, radius, single, outHops, effort, hopLimit)

sendPolygon (data, vertCount, vertices, single, outHops, effort, hopLimit)

• hopRad, geoRad, center/radius, vertices – region specification

• outHops – limit number of hops to route outside region to find path around voids

• single – send to single node or multiple nodes in region

(membership tests sketched below)
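The geographic regions above reduce to simple membership tests that forwarding nodes can evaluate locally. A sketch, with illustrative coordinates and the polygon vertices assumed in counter-clockwise order:

```python
import numpy as np

# Membership tests implied by the region-based calls above (illustrative;
# actual membership is decided by the routing layer as a packet spreads).
def in_geo_radius(node, source, geoRad):
    return np.linalg.norm(node - source) <= geoRad

def in_circle(node, centerX, centerY, radius):
    return np.linalg.norm(node - np.array([centerX, centerY])) <= radius

def in_convex_polygon(node, vertices):
    """True if node lies inside the convex hull of vertices (CCW order)."""
    v = np.asarray(vertices, dtype=float)
    edges = np.roll(v, -1, axis=0) - v          # edge vectors
    rel = node - v                              # node relative to each vertex
    cross = edges[:, 0] * rel[:, 1] - edges[:, 1] * rel[:, 0]
    return bool(np.all(cross >= 0))             # same side of every edge

node = np.array([0.5, 0.4])
print(in_convex_polygon(node, [[0, 0], [1, 0], [1, 1], [0, 1]]))  # True
```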

Device Hierarchy Sending

Send to sink – source node sends message to sensor network's data sink

Send to parent – source node sends message to its parent in hierarchy of device classes

Send to children – source node sends message to its children in hierarchy of device classes

Device Hierarchy API Calls

setLevel (level)

sendParent (data, effort, hopLimit)

sendChildren (data, effort, hopLimit)

childList = getChildren()

sendSink (data, effort, hopLimit)

• setLevel allows application to assign hierarchy levels suited to device capabilities

• each level-L device associated with level-(L-1) parent, level-(L+1) children

• each node can send directly to sink (level 1), regardless of level

Receive Modes

Finally, three receive modes supported (registration sketch below):

1. receiveTarget – node can examine messages for which it is the destination

2. receiveOverhear – node can passively examine any message transmitted within radio range

3. receiveForward – node can examine and modify any message it forwards (requires hop-by-hop reassembly of packet fragments)
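One plausible, purely hypothetical rendering of the three modes is callback registration; the decorator names mirror the mode names but are not the thesis API.

```python
# Hypothetical registration-style sketch of the three receive modes.
handlers = {"target": [], "overhear": [], "forward": []}

def receiveTarget(fn):   handlers["target"].append(fn);   return fn
def receiveOverhear(fn): handlers["overhear"].append(fn); return fn
def receiveForward(fn):  handlers["forward"].append(fn);  return fn

@receiveTarget
def on_delivery(msg):
    print("delivered here:", msg)

@receiveForward
def on_forward(msg):
    msg["hops"] = msg.get("hops", 0) + 1   # forwarders may modify in flight
    return msg

# toy dispatch for a message this node is forwarding
msg = {"payload": "d_{j,k}", "hops": 0}
for fn in handlers["forward"]:
    msg = fn(msg)
print(msg)
```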

Conclusions, Extensions

Developed distributed WT suited to sensor network deployment

Developed network API to support easy algorithm prototyping

• Investigate applicability to other tasks (e.g., query-routing, data recovery)

• Further study tradeoffs between temporal, spatio-temporal processing

• Implement API in resource-efficient manner

This slide left intentionally blank.

Communication Reliability Requirements

For proper inverse WT at sink, each node must receive coeffs. from all nodes in its predict and update neighborhoods

Options to ensure this include:

1. Require high reliability from routing

2. Repair P, U on link failure

3. Repair P-only on link failure

Distributed Compression Protocol

Starting with threshold $\tau_0$, iterate (toy simulation below):

1. sink broadcasts $\tau_i$

2. each node with $|d_{j,k}| \ge \tau_i$ sends WC to sink (if not sent already)

3. sink collects WCs, computes new estimate

4. while estimate residual exceeds some tolerance, repeat Step 1 for $\tau_{i+1} < \tau_i$
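A toy, centralized simulation of this loop; the halving schedule τ_{i+1} = τ_i/2, the tolerance, and the synthetic coefficients are all illustrative choices.

```python
import numpy as np

# Toy simulation of the threshold-passing protocol above.
rng = np.random.default_rng(0)
wcs = rng.standard_normal(200) * np.logspace(0, -3, 200)   # sparse-ish WCs
tau, tol = np.abs(wcs).max(), 1e-3
received = np.zeros_like(wcs)
sent = np.zeros(len(wcs), dtype=bool)

while np.mean((wcs - received) ** 2) > tol:        # Step 4: residual check
    tau /= 2                                       # Step 1: sink broadcasts tau
    new = (np.abs(wcs) >= tau) & ~sent             # Step 2: nodes with big WCs
    received[new], sent[new] = wcs[new], True      # Step 3: sink collects
print(f"sent {sent.sum()} of {len(wcs)} WCs, final tau = {tau:.4g}")
```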

WT Application: Distributed De-noising

IDEA: exploit signal sparsity in WCs to remove noise energy from measurements in wavelet domain

Denote original noisy measurements as $\tilde{s}_{J,k} = s_{J,k} + z_k$, where $z_k \sim \mathcal{N}(0, \sigma^2)$ IID

Can estimate each $s_{J,k}$ using modified WCs

[plot: PSNR vs. coefficient count; noise dominates at high coefficient counts]

De-noising Modes

Two forms of distributed de-noising:

1. in-network de-noising (requires inverse dist. WT)

2. compression with de-noising (guides threshold choice)

Universal Thresholding

Donoho/Johnstone's universal threshold for univariate Gaussian noise: $\lambda = \sigma \sqrt{2 \ln N}$

Must scale each coefficient's threshold to account for non-orthonormal transform ($W$) and noise variance $\sigma^2$

Each WC is hard-thresholded against its scaled threshold $\lambda_{j,k}$: $\hat{d}_{j,k} = d_{j,k} \cdot \mathbf{1}\{|d_{j,k}| > \lambda_{j,k}\}$; scaling coefficients are then re-estimated from the modified WCs

Distributing Universal Hard Thresholding

To estimate $\sigma$, use median absolute deviation of fine-scale wavelet coefficients: $\hat{\sigma} = \mathrm{median}(|d_{J-1,k}|)\,/\,0.6745$

Use de-centralized median protocol, gossiping with local time-series estimates, or single-node collection/dissemination…

NOTE: must only estimate $\sigma$ appropriate to stationarity of noise process (sketch below)
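A sketch of the estimator and threshold for the orthonormal case; the per-coefficient scaling for the non-orthonormal W noted above is omitted for brevity, and the example coefficients are synthetic.

```python
import numpy as np

# Universal hard thresholding with the MAD noise estimate from
# fine-scale WCs (orthonormal-transform simplification).
def universal_hard_threshold(wcs_by_scale):
    finest = wcs_by_scale[-1]                        # fine-scale WCs
    sigma = np.median(np.abs(finest)) / 0.6745       # MAD estimate of noise std
    n = sum(len(w) for w in wcs_by_scale)
    lam = sigma * np.sqrt(2 * np.log(n))             # universal threshold
    return [np.where(np.abs(w) > lam, w, 0.0) for w in wcs_by_scale], lam

rng = np.random.default_rng(1)
clean = [np.array([5.0, -4.0, 0.0, 0.0]), np.zeros(8)]
noisy = [c + 0.5 * rng.standard_normal(c.shape) for c in clean]
denoised, lam = universal_hard_threshold(noisy)
print("lambda =", lam)
```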

De-noising Evaluation

[table: PSNR in dB for four cases – low/high noise × smooth/discontinuous fields]

Results averaged over 50 trials of N=1000 nodes (randomly, uniformly placed)

De-noising Evaluation (In-Network)

[table: PSNR in dB for the same four cases – low/high noise × smooth/discontinuous fields]

De-noising Evaluation (Compression with De-Noising)

[plots: PSNR vs. coefficient count for each of the four cases (low/high noise × smooth/discontinuous); curves: Bayes, universal, original; 500 nodes, averaged over 50 trials; noise dominates at high coefficient counts]
