
Cooperative Event Detection in Wireless Sensor Networks

Georg Wittenburg

INRIA LIX, École Polytechnique

Route de Saclay, 91128 Palaiseau, France
[email protected]

Norman Dziengel, Stephan Adler, Zakaria Kasmi, Marco Ziegert, and Jochen Schiller

Department of Mathematics and Computer Science, Freie Universität Berlin

Takustraße 9, 14195 Berlin, Germany
[dziengel,adler,kasmi,ziegert,schiller]@inf.fu-berlin.de

Abstract

Event detection in wireless sensor networks is a sophisticated method for processing sampled data directly on the sensor nodes, thereby reducing the need for multi-hop communication with the base station of the network. In contrast to application-agnostic compression or aggregation techniques, event detection pushes application-level knowledge into the network. In-network event detection – especially the distributed form involving multiple sensor nodes – therefore has exceptional potential to increase energy efficiency and thus to prolong the lifetime of the network.

In this paper, we summarize recently proposed system architectures and algorithms employed for event detection in wireless sensor networks. Using the example of the AVS-Extrem platform, we illustrate how energy-efficient event detection can be implemented through a combination of custom hardware design and distributed event detection algorithms. We then present a brief evaluation of the detection accuracy and the energy consumption achievable by current systems.

1 Introduction

In order to construct highly energy-efficient Wireless Sensor Networks (WSNs), it is paramount to push application-level data processing as deeply into the network as possible. If sampled data is processed and evaluated close to its source, communication with the base station of the network is reduced, energy consumption is minimized, and the lifetime of the network is thus extended. Event detection in WSNs follows this strategy by programming the sensor nodes in such a way that the occurrence of deployment-specific, semantic events can be established directly on the sensor nodes, relying only on the locally available data from the sensors. This transformation of data from the physical sensory domain into the scenario-specific application domain stands in contrast to other techniques for in-network data processing, e.g., compression and aggregation, which are agnostic to the application-level properties of the data. Fundamentally, event detection trades the generality of these latter approaches for higher data reduction rates and thus higher energy efficiency.


Figure 1: Centralized vs. decentralized event detection (contrasting passive and active nodes, data traffic over radio links, the event location, and the base station in both approaches)

Event detection in WSNs is commonly used for tasks such as fire or hazard detection [1], vehicle tracking [2], area surveillance [13], undersea monitoring [10], or the classification of human motion sequences [15]. Depending on the application, a variety of sensors can be employed, including light and temperature sensors, accelerometers, microphones, or cameras. The raw data gathered by these sensors is – depending on the sampling rate – too voluminous to be transmitted to the base station of the WSN. Instead, one or multiple sensor nodes evaluate the raw data locally, seeking application-specific patterns, e.g., a temperature reading exceeding a threshold value (for a fire detection system), a transient change in a video image (for area surveillance), or the occurrence of a specific motion path (when monitoring human movement). Only the result of this localized evaluation is then sent to the base station, which, as illustrated in Figure 1, reduces the required network traffic substantially.

In general, designers of a WSN-based event detection system need to make two major architectural choices: First, the question arises to what extent the sensor nodes should cooperate during the detection process, and second, how the semantically relevant information is to be extracted from the sampled raw data. The first of these questions involves trade-offs related to the communication architecture, while the second one deals with algorithmic issues during data processing. The overall goals when tackling these questions are, of course, to minimize communication overhead (and thus energy expenditure) and to maximize the detection accuracy for a variety of use cases.

In this paper, we summarize and contextualize our findings in this field of research based on our previous work on the AVS-Extrem platform for cooperative event detection in WSNs, which we built, refined, and evaluated in multiple lab experiments [4, 5] and deployments [14, 13]. We begin with qualitative comparisons of possible communication architectures and algorithms for data processing in Sections 2 and 3 respectively, as part of which we also review the state of the art. We then illustrate the typical design choices that arise when building a deployment-ready event detection system in Section 4. In Section 5, we summarize experimental results from several deployments of the AVS-Extrem platform, and finally conclude and motivate future work in Section 6.

2 Architectures

As data processing and event detection play a key role in WSNs, several different approaches have been developed and deployed to process and to transport data and events within a WSN.


2.1 Local Detection

The most basic approach is local event detection. In this approach, each node gathers data from its local sensors and employs local algorithms (cf. Sec. 3) to decide whether a specific event has occurred. Data from neighboring nodes is not taken into account, and all signal processing is performed on the local node itself. If an event is detected, each node signals the results directly to the base station of the sensor network.

This approach is very easy to implement, but incurs a non-negligible communication overhead and is only applicable to fairly simple types of events. An exemplary implementation can be found in the fence surveillance system proposed by Kim et al. [8].

2.2 Centralized Evaluation

Centralized event detection is widely used in current real-world deployments. All nodes send either raw or preprocessed sensory data directly to the base station, which has significant computational and energy resources. The data is interpreted exclusively on the base station – individual sensor nodes have no knowledge about the semantics of the collected data.

Although this method has several advantages, e.g., good detection accuracy resulting from the global knowledge available at the base station, the network does not scale well and the continuous data stream quickly depletes the available energy. Furthermore, the dependency on the base station node introduces a single point of failure, thus rendering this architecture unsuitable for security-relevant applications.

Event detection using a centralized evaluation architecture has been implemented, amongst many others, by Gu et al. [7] for vehicle tracking.

2.3 Decentralized Evaluation

Decentralized evaluation is one way of mitigating the aforementioned problems of the centralized approach. In this approach, the network is clustered into smaller subnetworks which operate autonomously. Sensory data is gathered, exchanged, and processed within each of these clusters. One node in the cluster – the so-called cluster head – takes on the task of communicating with the base station when an event is detected or raw sensor data needs to be transmitted. The cluster head is selected at deployment time and may have additional computational and energy resources. Furthermore, cluster heads may also be equipped with a separate network interface to communicate directly with the base station, e.g., via the cellular phone network.

The main advantage of this concept is its reliability, since the network remains mostly operational even if individual subnetworks fail. Another advantage is the possibility to save energy: As only the nodes in one cluster need to exchange and process data when an event occurs, nodes in other clusters can remain in a low-power state. The limitations of this approach become apparent in ad hoc scenarios, as the network is impaired by the initial, event-agnostic selection of cluster heads.

Event detection employing decentralized evaluation has been used in large-scale deployments by Duarte and Hu [2] for vehicle classification.

2.4 Distributed Evaluation

In a distributed evaluation, each node processes data on its own and can take different roles during the event detection process. If several nodes detect an event, they communicate with each other and decide autonomously, i.e., without the support of a base station or cluster head, which type of event has occurred. The decision about the event type is made by a distributed algorithm running on each node: Nodes wake up as a result of an event, sample and process the sensory data, and afterwards exchange the results with other nodes that also woke up. It is not predefined, but rather established during the distributed detection process, which node reports the detected event to the base station. Depending on the application, it is possible that all nodes in the network remain in a low-energy state and only wake up upon being exposed to an event. If a node of the network fails to operate, the network itself remains operational and only a minor drop in detection accuracy is to be expected.

The major advantages of this architecture are its reliability, robustness, and the possibility for substantial energy savings. A drawback is the system complexity that results from its reliance on ad hoc reconfigurations. In particular, the autonomous evaluation needs to be very reliable, as the nodes change roles and the user has no access to the sensor data and cannot easily verify the correctness of the distributed evaluation.

Table 1: Trade-offs of architectural approaches to event detection in WSNs

Local detection
  Energy efficiency: Medium, since nodes process data locally and no communication is necessary. However, a single event may be reported to the base station multiple times by different nodes.
  Detection accuracy: Only applicable to applications with simple events, e.g., those detectable by monitoring data for exceeded threshold values.
  Exemplary implementations: Fence surveillance (Kim et al. [8])

Centralized evaluation
  Energy efficiency: Low, since a continuous data stream is transmitted from all nodes to the base station.
  Detection accuracy: High, since all data is available at the base station and a variety of complex algorithms can easily be applied.
  Exemplary implementations: Detection, tracking, and classification of vehicles (Gu et al. [7])

Decentralized evaluation
  Energy efficiency: Medium, since this is a combination of the previous two approaches.
  Detection accuracy: High, since all data is available on each cluster head.
  Exemplary implementations: Distributed vehicle classification (Duarte and Hu [2])

Distributed evaluation
  Energy efficiency: High, since data is only exchanged locally between nodes and events are only reported once to the base station.
  Detection accuracy: High, since distributed versions of complex algorithms can be employed.
  Exemplary implementations: Event detection (Martincic and Schwiebert [9]), fence surveillance (Wittenburg et al. [13])

A system using a fully distributed evaluation architecture has been proposed and simulated by Martincic and Schwiebert [9], and implemented and deployed as part of our own work on fence monitoring (cf. Sec. 4).

The trade-offs between these architectures as well as exemplary implementations are summarized in Table 1. As WSNs pose challenges in energy optimization and therefore need to communicate very sparingly, not every architecture is suitable for the particular requirements in this domain. Specifically, the distributed evaluation approach seems very promising for WSNs that operate on an unknown topology and require extensive energy savings.

3 Algorithmic Approaches

Expanding upon the architectural view presented in the previous section, we now shift our focus from the networking aspects to the data processing aspects of event detection in WSNs. Several alternatives for analyzing the collected raw data in order to establish whether a particular event has occurred have been proposed in the literature. The overall trade-off in this area consists of – as we will explore in this section – weighing resource consumption against the complexity of the events that the system is able to detect. Another interesting aspect to consider is whether a system is capable of learning events on its own, or whether external expert knowledge needs to be embedded into the detection algorithm.

3.1 Threshold-based Decisions

Event detection based on comparisons against threshold values is the most basic approach to event detection in WSNs. Obviously, it is applicable to all kinds of sensors that produce a continuous sampling range, e.g., temperature sensors, magnetometers, motion sensors, or microphones. Threshold values, which correspond in their dimensionality to the number of sensed values, are usually established by a domain expert and set either at deployment or at run time. Dynamically adjusting threshold values depending on the observed events is possible, but rarely implemented in practice.

Table 2: Trade-offs of algorithmic approaches to event detection in WSNs

Threshold-based decisions
  Computational overhead: Low, since simple monitoring of raw data is sufficient.
  Applicability: Only detects simple events, e.g., "It's very hot, hence there must be a fire."
  Exemplary implementations: Vehicle tracking (Gu et al. [7]), fence surveillance (Kim et al. [8])

Pattern recognition
  Computational overhead: High, since approaches of this type involve complex feature extraction and classification.
  Applicability: Applicable to a variety of complex events, depending on platform support for feature extraction and training.
  Exemplary implementations: Vehicle classification (Duarte and Hu [2]), human motions (Ghasemzadeh et al. [6]), fence surveillance (Wittenburg et al. [13])

Anomaly detection
  Computational overhead: Varies depending on the complexity of the events during normal operation.
  Applicability: Differentiates between known and unknown, i.e., anomalous, patterns.
  Exemplary implementations: Light tracking (Wälchli [11])
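The threshold-based scheme of Sec. 3.1 can be sketched minimally as follows. This is an illustrative sketch only; the function names, the per-axis comparison, and the threshold values are our own assumptions and are not taken from any of the cited systems:

```python
def exceeds(sample, thresholds):
    """One threshold per sensed dimension, e.g. (x, y, z) acceleration."""
    return any(abs(v) > t for v, t in zip(sample, thresholds))

def detect_event(samples, thresholds):
    """Report an event as soon as any sample crosses its threshold."""
    return any(exceeds(s, thresholds) for s in samples)

# Fence at rest vs. fence being shaken (accelerations in g, invented):
idle  = [(0.01, 0.02, 1.00), (0.02, 0.01, 0.99)]
shake = [(0.01, 0.02, 1.00), (1.90, 0.40, 1.20)]
assert not detect_event(idle,  thresholds=(1.5, 1.5, 1.5))
assert detect_event(shake, thresholds=(1.5, 1.5, 1.5))
```

Note how the dimensionality of the threshold tuple mirrors the number of sensed values, as described above.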

The key advantage of this approach is its simplicity. Comparisons against threshold values can be implemented directly on the sensor nodes that sample the data. Decentralized or distributed detection architectures, as discussed in the previous section, are thus not directly applicable to this detection scheme.

Event detection based on threshold values has been implemented by Gu et al. [7] as part of the VigilNet project to track vehicles, and by Kim et al. [8] as part of their fence surveillance system.

3.2 Pattern Recognition

Pattern recognition is the process of classifying a set of data samples into one of multiple classes. Various techniques for pattern recognition have been employed in different WSN deployments, and we are merely able to cover the fundamentals in this section. The process of pattern recognition is typically subdivided into four steps: sampling, segmentation, feature extraction, and classification. During sampling, raw data related to an event is gathered by a sensor and optionally preprocessed, e.g., for noise reduction. Segmentation employs thresholds to detect the beginning and end of an event. Feature extraction computes a set of highly descriptive features from the data and combines them into a feature vector. Features may comprise data such as minimum, maximum, average, histogram, and/or discrete Fourier transform, and their extraction from the raw data reduces the dimensionality while preserving characteristic information. The feature vector may combine features from multiple different sensors built into the sensor node; each feature is extracted separately from the respective raw data. During classification, feature vectors are evaluated, either through statistics or with a priori knowledge from a preceding training session. Algorithms for classification include, among many others, decision trees, neural networks, support vector machines, and k-nearest neighbors. Since these algorithms operate on the feature vector, different numbers and types of sensors are supported transparently. A detailed introduction to these techniques can be found in Duda et al. [3].
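The feature-extraction and classification steps can be sketched with a nearest-prototype classifier. The feature choice (minimum, maximum, mean), the prototype values, and the class names below are invented for illustration; see Duda et al. [3] for the underlying theory:

```python
import math

def extract_features(samples):
    """Reduce a raw sample window to a compact feature vector
    (here: minimum, maximum, and mean)."""
    return (min(samples), max(samples), sum(samples) / len(samples))

def classify(features, prototypes):
    """Assign the class whose trained prototype is nearest in
    Euclidean distance."""
    return min(prototypes, key=lambda label: math.dist(features, prototypes[label]))

# Invented prototypes obtained from a hypothetical training session:
prototypes = {"kick": (-2.0, 2.0, 0.0), "lean": (-0.2, 0.8, 0.3)}
window = [-1.8, 0.5, 1.9, -0.3]  # a segmented acceleration trace
label = classify(extract_features(window), prototypes)
assert label == "kick"
```

The dimensionality reduction is visible here: a window of arbitrary length collapses into a three-element feature vector before classification.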

Most of these algorithms require significant computational resources and thus need to be modified heavily in order to run on sensor nodes. Furthermore, approaches to event detection that employ pattern recognition techniques usually require scenario-specific training to initialize the classifier. Pre-configured threshold values can be used to simplify the initialization, but tend to reduce the event detection accuracy.

Event detection employing pattern recognition techniques has been used in large-scale deployments by Duarte and Hu [2] for vehicle classification, by Ghasemzadeh et al. [6] to detect specific human movement patterns, and by our own system for fence monitoring (cf. Sec. 4).

3.3 Anomaly Detection

Anomaly detection is a term used for approaches that focus on the specific case of detecting whether a particularly unusual event has occurred. This is achieved by learning typical system behavior over time and then classifying specific events as either normal or anomalous. Approaches with this goal expand upon the methods of the previous two approaches and incorporate techniques from the fields of intrusion detection and bio-inspired artificial immune systems.
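A minimal form of this idea learns the mean and spread of normal readings and flags outliers. The three-sigma criterion and all values below are illustrative assumptions, not taken from any cited system:

```python
import math

class AnomalyDetector:
    """Learns typical readings, then flags values far from the learned mean."""
    def __init__(self, k=3.0):
        self.k = k          # how many standard deviations count as anomalous
        self.values = []

    def train(self, value):
        self.values.append(value)

    def is_anomalous(self, value):
        mean = sum(self.values) / len(self.values)
        var = sum((v - mean) ** 2 for v in self.values) / len(self.values)
        return abs(value - mean) > self.k * math.sqrt(var)

# Learn "normal" temperature behavior, then classify new readings:
det = AnomalyDetector()
for v in [20.0, 21.0, 19.5, 20.5, 20.2]:
    det.train(v)
assert not det.is_anomalous(20.8)
assert det.is_anomalous(35.0)
```

Real systems of this kind learn continuously and use far richer models, but the normal-versus-anomalous decision structure is the same.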

One example of a system that specifically detects anomalies is the Distributed Event Localization and Tracking Algorithm (DELTA) proposed by Wälchli [11], which uses light sensors to track a moving flashlight.

Our discussion of algorithmic approaches is summarized in Table 2. The crucial properties of event detection algorithms are the computational overhead (in terms of processing and memory usage) and the scenario-dependent applicability. The computational overhead obviously increases as the algorithms become more complex. The applicability describes in which circumstances an acceptable accuracy can be expected from an algorithm. Since applicability varies depending on the deployment scenario, there is not a single preferable algorithmic approach to event detection in general.

4 An Exemplary Event Detection System: AVS-Extrem

Based on the general discussion of architectures and algorithms, we now present an exemplary platform for distributed event detection called AVS-Extrem. This platform is employed in a project in which wireless sensor nodes equipped with accelerometers are integrated into the fence surrounding a security-sensitive area. The goal of this deployment is to detect intrusions into this area. The sensor nodes are thus programmed to collaboratively distinguish between numerous events, e.g., opening the fence or climbing over the fence, based on the movement of the fence elements as measured by the sensors.

On the algorithmic side, the system implements a pattern recognition approach (cf. Sec. 3.2). Each event class is described by a prototype that incorporates the most descriptive features which are extracted from the training data. Training data is gathered during on-site training, as part of which the sensor network is repeatedly exposed to events of each class. When deployed, the prototypes allow the sensor nodes to distinguish between event classes. Architecturally, this is achieved using a distributed evaluation method (cf. Sec. 2.4) during which sensor nodes exchange extracted features and merge them into prototypes that span multiple sensor nodes.

4.1 Hardware Platform

The AVS-Extrem sensor nodes, as depicted in Figure 2a, were developed with motion-centric and localization-dependent applications in mind [5]. Each node is equipped with an ARM7-based NXP LPC2387 MCU with 96 KB RAM and 512 KB ROM running at a maximum of 72 MHz. For communications, a Texas Instruments CC1101 radio transceiver operating in the 868 MHz ISM band is used. A Bosch SMB380 low-power triaxial acceleration sensor with a range of up to 8 g, 10-bit resolution, and a sampling rate of up to 1,500 Hz is employed for acquiring motion data.

In addition to these core components, the PCB also includes a coulomb counter to measure battery usage, an SD card slot to store trained motion patterns, and optionally a GPS receiver to establish the location of the node. The rugged housing fixes the node in place within vertical fence bars and enables us to conduct highly repeatable experiments.

Figure 2: AVS-Extrem platform. (a) Sensor node, showing the ISM antenna, TI CC1101 transceiver, Bosch SMB380 3D acceleration sensor, NXP LPC2387 ARM7 MCU, Sensirion SHT11 RH/T sensor, SD card socket, FTDI FT232R USB UART IC, Linear Technology LTC4150 coulomb counter, LEDs, reset button, USB and JTAG connectors, power supply, external research pins, and a socket for an optional FSA03 GPS module. (b) System architecture, spanning the application layer (distributed event detection with preprocessing, feature extraction, prototype classifier, data quality estimator, and accelerometer calibration) and the system layer (FireKernel OS, Micro-Mesh routing, security, energy management, multiple base nodes, and the AVS sensor board with ARM7 MCU and acceleration sensor).

The MCU can be woken up from power-saving status by the radio transceiver, the accelerometer, and a scheduled interrupt timer from the RTC. These external interrupts are raised when a packet is received (wake on radio, WOR), when the accelerometer recognizes a change in the position of the node, or when a scheduled OS-level task needs to be run. Hence, only the radio transceiver, the accelerometer, and the RTC consume energy, while all other components of the node remain in low-power modes most of the time.

4.2 System Architecture

As illustrated in Figure 2b, the system architecture is divided into two layers: the application layer and the system layer. The system layer contains the AVS-Extrem sensor board, the operating system, and the energy management. The FireKernel OS [12] is a low-power operating system that supports threading and priority-based preemptive multitasking. The latter is necessary to support uninterrupted sampling of the sensor while at the same time communicating with other nodes. A flexible energy management system supports WOR and numerous MCU-specific energy-saving techniques.

For multi-hop communications, the nodes employ the reactive Micro Mesh Routing (MMR) protocol. We have adapted this protocol to our use case by adding the capability to reliably communicate with the base station of the sensor network via a set of multiple base nodes. Each of these nodes can be addressed by the network as a virtual base station and transparently relays packets to the real base station, which is connected to all base nodes via wireless or wired connections. This way, a fault-tolerant connection between any node in the WSN and the base station is provided. For secured communications, we have implemented the ParSec security layer, which ensures confidentiality, authenticity, and integrity of transmitted data by using a symmetric cipher algorithm in CBC mode.
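The CBC chaining used by such a security layer can be illustrated with a toy block cipher. The "cipher" below is a plain XOR with the key, which is of course not secure; ParSec's actual cipher and block size are not specified here, so everything in this sketch is an assumption made purely to show the chaining structure:

```python
def xor_block(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def toy_encrypt_cbc(plaintext_blocks, key, iv):
    """CBC chaining: each block is XORed with the previous ciphertext
    block before 'encryption' (a toy XOR cipher -- NOT secure)."""
    prev, out = iv, []
    for block in plaintext_blocks:
        ct = xor_block(xor_block(block, prev), key)
        out.append(ct)
        prev = ct
    return out

def toy_decrypt_cbc(cipher_blocks, key, iv):
    prev, out = iv, []
    for ct in cipher_blocks:
        out.append(xor_block(xor_block(ct, key), prev))
        prev = ct
    return out

key = bytes([0x5A] * 4)
iv = bytes([0x01, 0x02, 0x03, 0x04])
blocks = [b"node", b"data"]
assert toy_decrypt_cbc(toy_encrypt_cbc(blocks, key, iv), key, iv) == blocks
```

The chaining is what makes identical plaintext blocks encrypt differently, which matters for the repetitive packet payloads typical of WSNs.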

Figure 3: The distributed event detection process (1. event, 2. preprocessing, 3. feature extraction, 4. feature distribution, 5. classification, 6. report, shown across nodes n-1, n, and n+1)

The application layer encompasses a calibration routine for the acceleration sensor, optionally a data quality estimator, and the algorithmic core of our distributed event detection system. The task of the calibration routine is to compensate for the individual positional setup of the fence elements after every event. The data quality estimator employs the Dempster-Shafer theory in order to establish confidence levels for each measured data point and, by doing so, allows the system to ignore the readings of malfunctioning sensors. This provides additional input for the feature fusion module in the event detection system, as detailed in the next section.
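Dempster's rule of combination, on which such a quality estimator can build, can be sketched as follows. The frame of discernment ("sensor ok" vs. "sensor faulty") and the mass values are invented for illustration; the actual estimator's design is not detailed in this paper:

```python
def combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) with
    Dempster's rule, renormalizing away conflicting mass."""
    joint, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                joint[inter] = joint.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc
    return {a: v / (1.0 - conflict) for a, v in joint.items()}

OK, FAULTY = frozenset({"ok"}), frozenset({"faulty"})
EITHER = OK | FAULTY  # total ignorance

m1 = {OK: 0.6, EITHER: 0.4}               # evidence from one reading
m2 = {OK: 0.7, FAULTY: 0.1, EITHER: 0.2}  # evidence from another
m = combine(m1, m2)
assert abs(sum(m.values()) - 1.0) < 1e-9
assert m[OK] > m[FAULTY]  # combined belief favours a working sensor
```

Unlike a plain probability, the mass assigned to the whole frame (EITHER) lets a node express how uncertain a reading is, which is precisely what a confidence level per data point requires.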

4.3 Event Detection Algorithm

The distributed event detection system observes and evaluates events using multiple sensor nodes [13]. Events are recognized directly on the nodes in the network, i.e., without the need for a base station or other means of central coordination and processing. This is advantageous over other approaches to event detection in WSNs which transmit raw data to an external entity for evaluation or rely on simplistic pattern recognition schemes (cf. Secs. 2 and 3).

The general idea of our event detection system is that each node has its own view of any given event, but only for one node does this view correspond to one of the events it was trained to recognize. As illustrated in Figure 3, the nodes are attached to the fence at fixed inter-node distances. Whenever an event occurs, a lateral oscillation is propagated through the interconnected fence elements. The nodes thus sample the movement of the fence at different relative positions to the source of the event. If an event occurs, all affected nodes exchange the extracted features and concatenate their own features and the received features from their neighbors based on the relative sender position. This concatenated feature vector is then compared to a set of previously trained prototype vectors, and an event is reported in case of a match.
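The feature concatenation and prototype comparison just described can be sketched as follows. The node IDs, feature values, distance-based match criterion, and tolerance are all hypothetical; they only illustrate the per-node fusion step, not the actual AVS-Extrem classifier:

```python
import math

def fuse(own_features, neighbor_features):
    """Concatenate feature vectors ordered by relative sender position;
    as a simplification, neighbors are ordered by node ID here."""
    ordered = [own_features] + [v for _, v in sorted(neighbor_features.items())]
    return [x for vec in ordered for x in vec]

def matches(fused, prototype, tolerance):
    """A node reports the event only if its fused view is close enough
    to a trained prototype (Euclidean distance, an assumed criterion)."""
    return math.dist(fused, prototype) <= tolerance

own = [0.9, 0.1]                            # this node's extracted features
neighbors = {7: [0.2, 0.4], 9: [0.0, 0.3]}  # received features, by node ID
fused = fuse(own, neighbors)
assert fused == [0.9, 0.1, 0.2, 0.4, 0.0, 0.3]

prototype = [1.0, 0.0, 0.2, 0.5, 0.0, 0.3]  # a trained view of one event class
assert matches(fused, prototype, tolerance=0.5)
```

Because each node occupies a different position relative to the event, each obtains a different fused vector, and typically only one of them matches a trained prototype.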

For a typical deployment, the system operates in three distinct phases: In the calibration phase, the aforementioned calibration routine automatically calibrates the acceleration sensors. The training phase is initiated by a user via a control station to set up a newly deployed system. The purpose of this phase is to train the WSN to recognize application-specific events by executing a set of representative training events for each event class. Every event class is trained by extracting the same set of features from the sampled raw data on all sensor nodes, and then transmitting it to the control station. An adequately large pool of available features is crucial for the overall performance of the event detection system, since – depending on the characteristics of the deployment – some subsets of features are vastly superior to others in distinguishing between events. The control station performs a leave-one-out cross-validation across all collected features to automatically select the optimal subset of features for this specific deployment. A prototype containing the most expressive features for each event class is then calculated and transmitted back to each sensor node.
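The leave-one-out feature selection can be sketched with a 1-nearest-neighbour classifier. The data, the classifier choice, and the exhaustive subset search below are assumptions made for illustration and do not reproduce the actual control-station implementation:

```python
import math
from itertools import combinations

def loo_accuracy(samples, labels):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier."""
    correct = 0
    for i, s in enumerate(samples):
        rest = [(t, labels[j]) for j, t in enumerate(samples) if j != i]
        _, predicted = min(rest, key=lambda tl: math.dist(s, tl[0]))
        correct += predicted == labels[i]
    return correct / len(samples)

def select_features(samples, labels, n_keep):
    """Return the index tuple of the feature subset that maximizes
    leave-one-out accuracy (exhaustive search over subsets)."""
    n_features = len(samples[0])
    def project(subset):
        return [[s[k] for k in subset] for s in samples]
    return max(combinations(range(n_features), n_keep),
               key=lambda sub: loo_accuracy(project(sub), labels))

# Feature 0 separates the two event classes; features 1 and 2 are noise.
samples = [(0.0, 5.0, 1.0), (0.1, 1.0, 9.0), (1.0, 5.1, 1.2), (1.1, 0.9, 8.8)]
labels = ["shake", "shake", "climb", "climb"]
assert select_features(samples, labels, n_keep=1) == (0,)
```

The exhaustive search is exponential in the pool size; in practice a greedy or heuristic search would be used, but the validation principle is the same.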

Once the training is complete, the system enters the detection phase. In this phase, the nodes are configured to autonomously recognize the trained events, and the control station is no longer needed. As illustrated in Figure 3, the sensors gather data, which is then preprocessed, segmented, and normalized. Only features used in the prototype vectors are extracted from the raw data, and then combined to form a feature vector. The extracted feature vector is then flooded to all sensor nodes in the n-hop neighborhood. In our deployments, n is usually set to 1, since the radio range of our sensor nodes is significantly larger than the spatial expansion of the fence motion caused by an event. After receiving all required feature vectors, each node performs a feature fusion by combining the feature vectors based on the relative position of the sender. Each node establishes the relative position of other nodes by evaluating the unique node IDs of received packets. As part of this overall process, each node builds its own view of the event using feature vectors from its neighboring nodes. Obviously, the combined feature vector is different for each node, because the respective neighboring nodes perceive the event differently depending on their location. The prototype classifier running on the node whose view of the event matches the trained view classifies the event correctly, while the feature vector represents an unknown pattern for the other nodes, which in turn ignore the event. If deemed relevant to the use case, e.g., the event of climbing over the fence in our scenario, the node that correctly classified the event transmits a packet reporting the detection to the base station.

5 Experimental Evaluation

In this section, we present the results from our experimental evaluation of the detection accuracy and the energy efficiency of the AVS-Extrem platform. These results summarize four major experiments that we conducted over the past years [14, 4, 13, 5].

5.1 Detection Accuracy

We evaluate the detection accuracy using three experiments: an initial proof-of-concept deployment, a lab prototype, and a real-world deployment.

In the proof-of-concept deployment [14], we attached ten ScatterWeb MSB sensor nodes, the predecessor of our current AVS-Extrem node, to a fence and exposed them to six different events: long and short shaking of the fence, kicking against the fence, leaning against the fence, and peeking and climbing over the fence. The proof-of-concept system did not support autonomous training, and hence we relied on a custom-built heuristic classifier implemented in a rule-based middleware. This classifier was manually configured to classify events based on visible patterns in the raw data.

As part of the lab experiment [4], we trained three sensor nodes to cooperatively recognize four different motion paths based on the acceleration data measured by the sensor. The four motion paths, comprising a square, a triangle, a circle, and the capital letter U, were drawn simultaneously by three persons on a flat surface by moving the sensor nodes. Event detection was implemented using a cooperative fusion classifier which, for the first time, allowed for the feature vectors to include features from multiple sensor nodes.

Figure 4: Detection quality and energy consumption of different approaches to event detection. (a) Detection quality (sensitivity, specificity, PPV, NPV, and accuracy, in %) of distributed event detection for the proof-of-concept, lab prototype, and real-world deployments. (b) Power consumption (in mW) and network lifetime (in weeks) for no event detection, centralized event detection, and decentralized event detection.

Finally, in our real-world deployment [13], we attached 100 sensor nodes to the fence of a construction site. We exposed the WSN to four different classes of events: shaking the fence, kicking against the fence, leaning against the fence, and climbing over the fence. This deployment utilized the same event detection algorithm as the lab prototype, but additionally introduced automated feature selection based on leave-one-out cross-validation (cf. Sec. 4.3) during the training phase. For the training, we chose a region of the fence that was free of any external obstructions and trained the WSN with 15 events of each class.
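A feature selection loop of this kind can be sketched as follows. The greedy search and the 1-nearest-neighbor classifier used to score feature subsets are assumptions made for illustration; [13] details the actual procedure.

```python
import math

def loocv_accuracy(samples, labels, features):
    """Leave-one-out cross-validation: classify each training sample
    against all others with 1-nearest-neighbor on the chosen features."""
    correct = 0
    for i, s in enumerate(samples):
        best_label, best_dist = None, float("inf")
        for j, t in enumerate(samples):
            if i == j:
                continue  # leave the i-th sample out
            d = math.dist([s[f] for f in features], [t[f] for f in features])
            if d < best_dist:
                best_label, best_dist = labels[j], d
        correct += (best_label == labels[i])
    return correct / len(samples)

def greedy_select(samples, labels, n_features):
    """Greedily add the feature that most improves LOOCV accuracy."""
    chosen = []
    while len(chosen) < n_features:
        remaining = [f for f in range(len(samples[0])) if f not in chosen]
        best = max(remaining,
                   key=lambda f: loocv_accuracy(samples, labels, chosen + [f]))
        chosen.append(best)
    return chosen
```

Because every candidate subset is scored on held-out samples, the selection favors features that generalize rather than ones that merely fit the training events.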

Figure 4a shows sensitivity, specificity, Positive Predictive Value (PPV), Negative Predictive Value (NPV), and accuracy for these three experiments.¹

The figure illustrates that AVS-Extrem achieves near-perfect results under lab conditions. Further, the final real-world deployment performs substantially better than the initial proof-of-concept setup with the rule-based classifier. The current system achieves an overall detection accuracy of 87.1%, an improvement of 28.8% over the proof-of-concept.
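For reference, all five metrics derive from the four confusion counts (true/false positives and negatives) in the standard way; the counts below are illustrative, not experimental data.

```python
def detection_metrics(tp, fp, tn, fn):
    """Compute the five detection quality metrics from confusion counts."""
    return {
        "sensitivity": tp / (tp + fn),   # fraction of real events detected
        "specificity": tn / (tn + fp),   # fraction of non-events correctly ignored
        "ppv": tp / (tp + fp),           # fraction of reports that were real events
        "npv": tn / (tn + fn),           # fraction of silences that were correct
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Illustrative counts only:
print(detection_metrics(tp=90, fp=10, tn=80, fn=20)["accuracy"])  # 0.85
```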

5.2 Energy Consumption

In contrast to the detection accuracy experiments, the energy consumption is evaluated on the succeeding hardware platform, AVS-Extrem, as introduced in Section 4. In our experiments in [5], we evaluated the energy consumption of a sensor node during the distributed event detection and recognition process. To measure the energy consumption of the whole board as accurately as possible, we soldered a 10 Ω shunt resistor into the supply line, which is powered by a reference voltage of 5 V. The voltage across the shunt resistor was measured using a digital sampling oscilloscope (DSO). As the resistance and the voltage are known, we can calculate the current and use it to compute the electric power drawn by the sensor node during one DSO sample. By integrating the electric power over the duration of one system state, e.g., packet transmission or IDLE mode, we can measure the energy needed. This information can then be used to approximate the energy consumption of the sensor node over a given time (cf. [5]). During the event detection phase, the sensor nodes use the MCU power-down (PD) mode, which also shuts down all internal peripherals. The wireless transceiver uses the wake-on-radio (WOR) mode and thus remains available for processing incoming data. The acceleration sensor is active and monitors the movement of the fence.

¹Definitions of these metrics are given in [13].
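The measurement arithmetic can be sketched as follows, assuming the node sits in series with the shunt and therefore sees the supply voltage minus the shunt drop; the sample values and timing below are hypothetical.

```python
def energy_joules(shunt_samples_v, dt_s, r_shunt_ohm=10.0, v_supply=5.0):
    """Integrate the node's power draw over a run of DSO samples.
    Current through the shunt: I = V_shunt / R.
    The node sees v_supply - V_shunt, so P = I * (v_supply - V_shunt).
    Energy is the sum of P * dt over all samples."""
    energy = 0.0
    for v in shunt_samples_v:
        current = v / r_shunt_ohm
        energy += current * (v_supply - v) * dt_s
    return energy

# Hypothetical trace: a constant 100 mV shunt drop sampled every 1 ms
# for one second, i.e. 10 mA at roughly 4.9 V, or ~0.049 J in total.
print(energy_joules([0.1] * 1000, dt_s=0.001))
```

Dividing the energy of one system state (e.g., a packet transmission) by its duration yields the average power figures plotted in Figure 4b.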

Figure 4b illustrates the average energy consumption and the resulting extrapolated network lifetime. The underlying scenario for this calculation is a deployment of seven nodes for which an event is generated and processed five times per hour. For comparison, we also plot the numbers for a deployment of the same sensor network, but without any event-related activity. As can be seen in the figure, the lifetime of the network that employs centralized event detection is drastically reduced in comparison to the idle network due to increased energy consumption. In contrast, distributed event detection is able to reduce the per-node energy consumption by 77%, thus increasing the lifetime of the network to an extrapolated total of 41 weeks.
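Extrapolating lifetime from average power draw is a simple division; the battery capacity used below is an assumption chosen for illustration and is not a figure from our deployment.

```python
def lifetime_weeks(avg_power_mw, battery_capacity_mwh):
    """Extrapolate node lifetime (in weeks) from average power draw.
    battery_capacity_mwh is an assumed value, not a measured one."""
    hours = battery_capacity_mwh / avg_power_mw
    return hours / (24 * 7)

# With an assumed 16.8 Wh battery, a node averaging 10 mW lasts
# 10 weeks; cutting the draw by 77% extends this proportionally.
print(lifetime_weeks(10.0, battery_capacity_mwh=16800.0))  # 10.0
```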

These results underline the validity of our initial hypothesis that distributed event detection is able to jointly achieve the otherwise conflicting goals of high-accuracy event detection and long network lifetime.

6 Conclusion and Outlook

In this paper, we described how event detection in wireless sensor networks comes in several different forms, each with its specific trade-offs and, at times, particularly suited for certain types of deployments. The overall dominating factor is the trade-off between energy efficiency and event detection accuracy, two goals which are mutually exclusive for simple approaches. Systems that operate distributively, such as the AVS-Extrem platform which we used as an example in this paper, pave the way towards deployments that can provide both highly accurate event detection and a long network lifetime.

As distributed approaches to event detection mature, we expect that a new, deployment-time trade-off will emerge: Systems that adapt techniques from machine learning (such as AVS-Extrem) profit from their inherent adaptability to specific scenarios, but on the downside also require extensive training for each new deployment. Other types of systems rely on on-site parametrization, which can be performed quickly but is unable to detect complex events. It will thus be interesting to explore to what extent event detection systems can not only achieve high accuracy at low power consumption, but also be deployed efficiently without sacrificing either of the aforementioned properties.

Acknowledgments

This work was funded in part by the German Federal Ministry of Education and Research (Bundesministerium für Bildung und Forschung, BMBF) through the project AVS-Extrem.

References

[1] D. M. Doolin and N. Sitar. Wireless Sensors for Wildfire Monitoring. In Proceedings of the SPIE Symposium on Smart Structures & Materials / NDE '05, San Diego, CA, USA, Mar. 2005.

[2] M. F. Duarte and Y. H. Hu. Vehicle Classification in Distributed Sensor Networks. Journal of Parallel and Distributed Computing, 64(7), July 2004.

[3] R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification. Wiley-Interscience, 2nd edition, Nov. 2000.

[4] N. Dziengel, G. Wittenburg, and J. Schiller. Towards Distributed Event Detection in Wireless Sensor Networks. In Adjunct Proceedings of the Fourth IEEE/ACM International Conference on Distributed Computing in Sensor Systems (DCOSS '08), Santorini Island, Greece, June 2008.

[5] N. Dziengel, M. Ziegert, S. Adler, Z. Kasmi, S. Pfeiffer, and J. Schiller. Energy-Aware Distributed Fence Surveillance for Wireless Sensor Networks. In Proceedings of the Seventh International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP '11), Adelaide, Australia, Dec. 2011.

[6] H. Ghasemzadeh, V. Loseu, and R. Jafari. Collaborative Signal Processing for Action Recognition in Body Sensor Networks: A Distributed Classification Algorithm Using Motion Transcripts. In Proceedings of the Ninth ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN '10), Stockholm, Sweden, Apr. 2010.

[7] L. Gu, D. Jia, P. Vicaire, T. Yan, L. Luo, A. Tirumala, Q. Cao, T. He, J. A. Stankovic, T. Abdelzaher, and B. H. Krogh. Lightweight Detection and Classification for Wireless Sensor Networks in Realistic Environments. In Proceedings of the Third International Conference on Embedded Networked Sensor Systems (SenSys '05), San Diego, CA, USA, Nov. 2005.

[8] Y. Kim, J. Kang, D. Kim, E. Kim, P. Chong, and S. Seo. Design of a Fence Surveillance System Based on Wireless Sensor Networks. In Proceedings of the Second International Conference on Autonomic Computing and Communication Systems (Autonomics '08), Turin, Italy, Sept. 2008.

[9] F. Martincic and L. Schwiebert. Distributed Event Detection in Sensor Networks. In Proceedings of the International Conference on Systems and Networks Communications (ICSNC '06), Tahiti, French Polynesia, Oct. 2006.

[10] A. Tavakoli, J. Zhang, and S. H. Son. Group-Based Event Detection in Undersea Sensor Networks. In Proceedings of the Second International Workshop on Networked Sensing Systems (INSS '05), San Diego, CA, USA, June 2005.

[11] M. Walchli, P. Skoczylas, M. Meer, and T. Braun. Distributed Event Localization and Tracking with Wireless Sensors. In Proceedings of the Fifth International Conference on Wired/Wireless Internet Communications (WWIC '07), Coimbra, Portugal, May 2007.

[12] H. Will, K. Schleiser, and J. Schiller. A Real-time Kernel for Wireless Sensor Networks Employed in Rescue Scenarios. In Proceedings of the 34th IEEE Conference on Local Computer Networks (LCN '09), Zurich, Switzerland, Oct. 2009.

[13] G. Wittenburg, N. Dziengel, C. Wartenburger, and J. Schiller. A System for Distributed Event Detection in Wireless Sensor Networks. In Proceedings of the Ninth ACM/IEEE International Conference on Information Processing in Sensor Networks (IPSN '10), Stockholm, Sweden, Apr. 2010.

[14] G. Wittenburg, K. Terfloth, F. L. Villafuerte, T. Naumowicz, H. Ritter, and J. Schiller. Fence Monitoring - Experimental Evaluation of a Use Case for Wireless Sensor Networks. In Proceedings of the Fourth European Conference on Wireless Sensor Networks (EWSN '07), Delft, The Netherlands, Jan. 2007.
