Illumimote: Multimodal and High-Fidelity Light Sensor Module for Wireless Sensor Networks



996 IEEE SENSORS JOURNAL, VOL. 7, NO. 7, JULY 2007

Illumimote: Multimodal and High-Fidelity Light Sensor Module for Wireless Sensor Networks

Heemin Park, Student Member, IEEE, Jonathan Friedman, Student Member, IEEE, Pablo Gutierrez, Vidyut Samanta, Jeff Burke, and Mani B. Srivastava, Senior Member, IEEE

Abstract—We describe the system requirements, design, system integration, and performance evaluation of the Illumimote, a new light-sensing module for wireless sensor networks. The Illumimote supports three different light-sensing modalities: incident light intensity, color intensities, and incident light angle (the angle of ray arrival from the strongest source); and two situational sensing modalities: attitude and temperature. The Illumimote achieves high performance, comparable to commercial light meters, while conforming to the size and energy constraints imposed by its application in wireless sensor networks. We evaluated the performance of the Illumimote for light intensity, color temperature, and incident light angle measurements and verified the function of the attitude sensor. The Illumimote consumes about 90 mW when all on-board features are activated. We describe our design and the experiment design for the performance evaluation.

Index Terms—Embedded systems, light sensors, wireless sensor networks.

I. INTRODUCTION

WIRELESS sensor networks (WSNs) composed of tiny embedded systems, each with a processor, sensor, radio, and battery, are already pervasively employed in many areas including target tracking and habitat monitoring [1], but they also have exciting applications in the arts, multimedia, and entertainment [2]. The poor sensing quality, fidelity, and diversity of currently available sensor modules have limited expansion into new areas such as film and video production [3], [4] and lighting-control applications [5]. To support research and development in these WSN areas, high-fidelity light-sensing modules in a compact form factor are required.

The Illumimote was conceived in a joint effort by filmmakers and engineers to apply the evolving technologies of WSN to new purposes in art and entertainment. WSN offers many unique opportunities for improving the creative and business processes of entertainment. One example is the continuity management

Manuscript received March 15, 2006; revised September 6, 2006. This work was supported in part by the National Science Foundation under Award CNS-0306408; the Center for Embedded Network Sensing, University of California, Los Angeles; and Intel Corporation. Expanded from a paper presented at the Sensors 2005 Conference. The associate editor coordinating the review of this paper and approving it for publication was Dr. Giorgio Sberveglieri.

H. Park, J. Friedman, and M. B. Srivastava are with the Networked and Embedded Systems Laboratory, Electrical Engineering Department, University of California, Los Angeles, Los Angeles, CA 90095-1594 USA (e-mail: [email protected]; [email protected]; [email protected]).

P. Gutierrez, V. Samanta, and J. Burke are with the Center for Research in Engineering, Media and Performance, School of Theater, Film and Television, University of California, Los Angeles, Los Angeles, CA 90095-1622 USA (e-mail: [email protected]; [email protected]; [email protected]).

Color versions of one or more of the figures in this paper are available online at http://ieeexplore.ieee.org.

Digital Object Identifier 10.1109/JSEN.2006.886999

of lighting: the order in which an audience views a film’s sequence of events is remarkably different from the order in which they are produced. Shots are filmed in the order that minimizes cost and makes best use of actors, crew, and locations. Footage captured at these different times must appear the same when shown consecutively, or differences must be controllable if they are required for creative purposes. Therefore, it is important to monitor and replicate the quality of light (illuminance and color) in each shot, so that footage captured at different times or in different locations does not show unexpected differences, which may not be perceived by the human eye but affect the film stock. The Lord of the Rings trilogy, for example, was filmed over a year and a half of production and required that footage be captured for use in three different movies with vastly differing release dates and schedules. Even with a staff of more than 2400 people, maintaining continuity was remarkably difficult, as notes had to be taken by hand and conditions were constantly changing. This is not uncommon, and many large-budget feature films require significant postproduction digital image manipulation prior to release, which is quite expensive. Continuity management is required for props, scenery, and actor and camera information, as well as lighting, though the term is typically applied to management of nontechnical elements. We have focused on lighting instrumentation as the first component of our Advanced Technology for Cinematography (ATC) [6] because of its vital role in the creative process of filmmaking.

ATC [6] is a joint project of the Henry Samueli School of Engineering and Applied Science and the School of Theater, Film and Television, both at the University of California, Los Angeles (UCLA). The Illumimote is part of their larger vision to increase flexibility and creative control in media production using sensor networks and other emerging technologies. Deploying networks of tiny sensors adds a data acquisition layer to the film production environment that supports on-set decision making, such as the lighting adjustment described above, as well as postproduction and asset management. Another component of the ATC project is UCLA’s Augmented Recording System (ARS) [4]. In ARS, wireless sensors can be deployed onto a set to collect data in synchrony with the film or video frame rate. ARS provides a framework for establishing a correlation between the media footage and sensor data. Initial work with ARS has prompted the development of the Illumimote and other high-quality sensor platforms that can be deployed atop Mica motes, the de facto standard for WSN nodes. Their development addresses limitations of sensors currently available for this platform. For example, the light sensors on the MTS310 and MTS400¹ are inadequate for high-fidelity applications, being too sensitive to in-

¹Data sheet available at http://www.xbow.com/.

1530-437X/$25.00 © 2007 IEEE


PARK et al.: ILLUMIMOTE: LIGHT SENSOR MODULE FOR WIRELESS SENSOR NETWORKS 997

frared radiation and lacking the necessary dynamic range. To our knowledge, the Illumimote is now the fastest, lowest power, most accurate, and most feature-complete light sensor available in the field of WSN.

The rest of this paper is organized as follows. Section II presents system requirements of the light-sensing module for our application. In Section III, the implementation of the sensor module is described. Section IV presents the calibration methods we used. The experimental setup and performance evaluation are presented in Section V, and Section VI concludes this paper.

II. SYSTEM REQUIREMENTS

The Illumimote is designed to have performance equal to or better than the class of commercial light intensity and color temperature meters used in the entertainment, film, and video production industries. The initial design discussed in this paper will continue to be refined toward this goal. Such performance is required to provide confidence in the device for its target audience, allowing it to be used for basic metering tasks in real production.

The capabilities gained by the choice of a standard wireless sensor network platform and the Illumimote’s high-fidelity sensing performance will allow us to explore new techniques and approaches to measuring light, one of the most practically and creatively important elements of media production. With these capabilities, we also expect the Illumimote may prove useful in industrial and commercial applications that require many physically distributed light measurements, such as construction and video projection calibration.

Design criteria for the Illumimote include the following capabilities. First, we list system requirements designed to match existing instruments in a significantly smaller form factor.

• Light intensity and color temperature sensing: The device should be capable of measuring both incident light intensity and color temperature in a single package, to simplify deployment and use.

• Robustness against infrared energy: Both sunlight and very high-power incandescent fixtures are common light sources in media production. The sensors must be bandlimited so as not to be unexpectedly saturated by infrared energy.

• Wide dynamic range: Our final design target for the Illumimote’s dynamic range is to enable measurement in standard indoor production settings (up to 1000 lux) through very bright outdoor environments (up to 100 000 lux). The first revision of the Illumimote provides a 2–18 500 lux dynamic range for light intensity, with color temperature measurements available throughout this range of incident intensities. This is sufficient for all indoor and high-intensity studio environments, and for morning/evening outdoor conditions.

• Fast response time: The frame rate of NTSC television video is 29.97 frames per second (fps), so the duration of a television frame is approximately 33.37 ms. (Film typically uses a slightly slower frame rate of 24 fps.) Therefore, the Illumimote should be able to capture and process the light information in 33.37 ms to capture light changes within one video or film frame. (The ARS system discussed in the Introduction allows these measurements to be synchronized with film and video capture.) Higher frame rates would support special effects photography and other specialized applications.

• High accuracy: The Illumimote is designed to match the accuracy of commercial meters, and to exceed it if possible, to support novel future uses we hope to discover during deployment. This requirement is verified experimentally in later sections.
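The frame-duration figures in the fast-response-time requirement follow from simple arithmetic; a short check (plain Python, not Illumimote firmware):

```python
# Frame periods for the capture rates discussed above.
def frame_period_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

ntsc = frame_period_ms(29.97)   # NTSC video
film = frame_period_ms(24.0)    # typical film

print(f"NTSC frame: {ntsc:.2f} ms")  # ~33.37 ms
print(f"Film frame: {film:.2f} ms")  # ~41.67 ms
```

Any sampling pipeline that finishes within the NTSC period also meets the slower film deadline.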

Because the Illumimote offers the performance of commercial light meters on a wireless sensor network platform, it must also satisfy requirements beyond those addressed by current commercial light-meter technology.

• Wireless sensor network compatibility: A primary design requirement was that the Illumimote be compatible with the Mica mote from Crossbow, a common platform in wireless sensor network research and development. This allows us to explore how the benefits of sensor networks—for example, low power consumption, small form factor, distributed computation, and wireless communication—can support the target domain of media production.

• Proprioceptive sensing: Elsewhere, we suggest that proprioceptive sensing [2] is an important role of embedded devices in expressive applications like media production. This and other key roles of embedded devices require them to have some knowledge of their own placement and area of observation or region of responsibility (RoR). Along with its incident light angle sensors, the Illumimote’s situational sensors—a two-axis accelerometer for orientation information and a temperature sensor—provide information in support of basic light measurements, allowing for some knowledge of the sensor’s RoR and surrounding environment.
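As a rough sketch of how a two-axis accelerometer yields Earth-plane-relative orientation, the snippet below converts static X/Y readings (in units of g) into tilt angles. The formulas are the standard gravity-projection identities, not code taken from the Illumimote; the function name is illustrative.

```python
import math

def tilt_angles(ax_g: float, ay_g: float) -> tuple:
    """Pitch/roll (degrees) of a static two-axis accelerometer.

    ax_g, ay_g: accelerations along the sensor's X and Y axes,
    in units of g. With only two axes, each tilt angle follows
    from the gravity component projected onto that axis.
    """
    # Clamp to [-1, 1] to guard against noise pushing |a| past 1 g.
    ax_g = max(-1.0, min(1.0, ax_g))
    ay_g = max(-1.0, min(1.0, ay_g))
    pitch = math.degrees(math.asin(ax_g))
    roll = math.degrees(math.asin(ay_g))
    return pitch, roll

# A sensor lying flat reads ~0 g on both axes -> level.
print(tilt_angles(0.0, 0.0))
# 0.5 g on the X axis corresponds to roughly a 30 degree pitch.
print(tilt_angles(0.5, 0.0))
```

This static-tilt model is only valid when the sensor is not accelerating, which holds for a light meter placed on a set.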

III. DESIGN OF THE ILLUMIMOTE

A. Light Sensors

Following the sensing modalities described in Section II, the Illumimote’s data acquisition capabilities cover the three principal attributes of illumination: signal strength (intensity), frequency (color), and transmission vector (incident light angle and sensor attitude).

• Incident light intensity sensor: The Illumimote acquires incident light intensity with the precision of a commercial light meter (a Sekonic L-558Cine² light meter was used as the reference), for which both dynamic range and accuracy are of interest. The principal detector is a Hamamatsu silicon photodiode S1133,³ chosen for its comparatively large active area. This type of diode increases the signal-to-noise ratio (SNR) for low-intensity measurements and allows for reduced power consumption under high-intensity conditions (when the sensitivity control unit, discussed later, is used). Further, it is surface coated with an infrared-cut film so as to achieve a spectral response range from 320 to 730 nm (i.e., visible bandlimited).

²http://www.sekonic.com/Products/L-558Cine.html.
³http://usa.hamamatsu.com.



Fig. 1. Photos of the Illumimote [1]. (a) Prototype Illumimote; (b) fabricated Illumimote with attachment of lumisphere; (c) front side of Illumimote; and (d) back side.

• Color intensity sensors: Color intensity sensors for red, green, and blue are used to calculate color temperature [7]. We adopted the Hamamatsu S6428-01 (red), S6429-01 (green), and S6430-01 (blue), for similar reasons as the S1133. Calculation of color temperature and calibration of the RGB sensors are discussed in Section IV.

• Incident light angle sensors: The determination of the angle to the strongest incident light source involves a pair of Hamamatsu S6560 sensors. Each component includes dual photodiodes with a vertical barricade separating them. Consequently, the position of the light, relative to the shadow cast by the differential illumination, implies the angle to the source along one axis. On the Illumimote, the pair of sensors are oriented orthogonally to create the X-Y basis vectors.

• Situational sensors: Additional sensors are included on-board to provide richer proprioceptive information [2] on the operating status of the device. A gravity-based attitude sensor (accelerometer) is included to allow for Earth-plane relative transformation in the event that the sensor is not oriented parallel to the ground. A temperature sensor is also included to detect overheating conditions that might occur under high-intensity lighting.

B. System Architecture and Implementation

Our Illumimote has evolved from an early prototype shown in Fig. 1(a), which adopted a simple pull-down resistor photodiode bias circuit and instrumentation amplifier architecture, to the recently fabricated version shown in Fig. 1(b). The latter

Fig. 2. Architecture of the Illumimote [1].

employs a two-stage active suppression power supply, dynamically configurable photodiode bias (sensitivity control), and a situational sensor unit. The overall system architecture diagram of the Illumimote appears in Fig. 2, in which only one light sensor channel is shown. There are eight light sensor channels, allocated based on the number of detector circuits required to capture each illumination attribute. For example, the color temperature unit requires three channels—one for each of red, green, and blue luminosity. Signals from the eight light-acquisition units and four situational units are multiplexed via the channel selection unit and presented to the analog-to-digital converter (ADC) for conversion into a 10-bit digital signal. The resultant data are conveyed to the networked and embedded nodes (in our case, MicaZ motes) via either the I2C data bus or a direct 16550A-compatible UART link that uses line-level (rail-to-rail) output. The operation of the Illumimote’s units may be controlled directly from the mote via the I2C bus or locally by an onboard Atmel ATmega48 microprocessor. Employing the local processor relieves the network interface (mote) of any real-time constraints associated with frame-rate-accurate sampling. The local processor also exposes interrupt facilities both to and from the host processor onboard the mote. When operating in this mode, the continuous I2C bus may be severed and reattached dynamically (the hardware is bus-state aware) to create two isolated buses—one local to the Illumimote, and one local to the mote—as needed. In addition to calibration functions, the embedded temperature sensor can wake a sleeping mote in the event of a dangerous thermal condition (risk of meltdown).
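The multiplexed acquisition path can be sketched as a small simulation. The channel names, the 3.3 V reference, and the helper functions here are assumptions for illustration only; the real Illumimote's register map and voltages may differ.

```python
# Sketch of the acquisition path: twelve analog channels
# (8 light + 4 situational) multiplexed into one 10-bit ADC.
# Channel names and the 3.3 V reference are illustrative
# assumptions, not taken from the Illumimote schematic.

CHANNELS = ["intensity", "red", "green", "blue",
            "angle_xa", "angle_xb", "angle_ya", "angle_yb",
            "accel_x", "accel_y", "temperature", "vref"]
V_REF = 3.3  # assumed ADC reference voltage

def adc_convert(voltage: float, bits: int = 10) -> int:
    """Quantize a voltage into an unsigned ADC code."""
    voltage = max(0.0, min(V_REF, voltage))
    return min((1 << bits) - 1, int(voltage / V_REF * (1 << bits)))

def sample_all(read_channel) -> dict:
    """Select each channel in turn and digitize it, as the
    channel-selection unit and ADC do in hardware."""
    return {name: adc_convert(read_channel(name)) for name in CHANNELS}

# Example with a stand-in analog front end: mid-scale everywhere.
readings = sample_all(lambda name: 1.65)
print(readings["intensity"])   # 512 (half of the 10-bit range)
```

The point of the sketch is the data flow (select, digitize, hand off), not the electrical details; in the real system the resulting codes travel to the mote over I2C or the UART link.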

The assembled Illumimote with a lumisphere appears in Fig. 1(b). The role of the lumisphere is to protect the sensors and to integrate incident light from all directions. On the bottom, the Illumimote features a connector that is compatible with Mica-type sensor nodes (Mica2, MicaZ, Cricket, etc.).

C. Sensitivity Control

The sensitivity control unit (SCU) extends the dynamic range of the photodiodes by adjusting the bias current presented to each channel. Unlike more traditional gain control, which is applied at the amplifier, sensitivity control extends dynamic range



without suffering the limitations of the amplifier’s SNR. For example, the SCU bias circuit may present a larger bias resistance to the sensor when necessary to produce larger input voltages to the amplifier at low light levels. This avoids the additional amplification of the amplifier’s internal thermal and coupled noise that occurs in the traditional model when low-light conditions require a higher gain setting.

The SCU offers six bias resistor values spanning four orders of magnitude, from 1 kΩ to 10 MΩ. The final resistor values used on the Illumimote were obtained from experiments performed by the authors on a set of initial choices. Data from these early experiments were used to produce a model that predicted the final resistor values. These choices were then verified experimentally (see Fig. 6) and have been shown to achieve a comparably wide dynamic range of four orders of magnitude in measured illuminance (lux, i.e., lumens per square meter).
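The sensitivity-control idea can be sketched as follows. The six resistor values below are illustrative decade stand-ins (the final values chosen for the Illumimote are not listed here), and the ADC full-scale voltage is an assumption.

```python
# Sketch of sensitivity control: pick the largest bias resistor
# that keeps the photodiode's output voltage within the ADC's
# input range. The six resistor values are illustrative stand-ins
# spanning 1 kOhm - 10 MOhm, not the Illumimote's actual set.

BIAS_RESISTORS = [1e3, 1e4, 1e5, 1e6, 3.3e6, 1e7]  # ohms (assumed)
V_FULL_SCALE = 3.3  # assumed ADC full-scale voltage

def select_bias(photocurrent_a: float, headroom: float = 0.9) -> float:
    """Return the largest resistance whose V = I * R stays below
    headroom * full scale; fall back to the smallest otherwise."""
    for r in sorted(BIAS_RESISTORS, reverse=True):
        if photocurrent_a * r <= headroom * V_FULL_SCALE:
            return r
    return min(BIAS_RESISTORS)

print(select_bias(1e-9))   # dim light, tiny current -> largest R
print(select_bias(1e-4))   # bright light -> much smaller R
```

A large resistance at low light produces a large input voltage before the amplifier, which is exactly how the SCU avoids amplifying the amplifier's own noise at a high gain setting.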

IV. CALIBRATION

A. Light Intensity Sensor Calibration

The Illumimote’s current-mode architecture exhibits a linear response vastly superior to that of the MTS310 [8]. Consequently, a linear data-fitting method is appropriate for correlation with a reference light meter. In order to convert the digitized sensor values to light intensity (lux), a linear transformation with two coefficients is used: L = a·v + b, where L is the converted lux value, v is the ADC reading, and a and b are calibration coefficients. The method to find the optimal coefficients involves three steps, as follows. First, we plot the Illumimote’s ADC readings with respect to reference lux values measured by a commercial light meter on a two-dimensional plane. Second, the line v = c1·L + c2 that best represents the plot of the ADC values is found by a least-squares fit in Matlab. Finally, the calibration coefficients a and b are obtained by inverting this line onto L = a·v + b, which gives a = 1/c1 and b = −c2/c1. We collected the ADC output values and calibrated a and b for four of the Illumimote’s six sensitivity settings. For the performance evaluation, the first half of the collected data was used to calculate the calibration coefficients a and b.
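The three calibration steps can be sketched with NumPy's least-squares polynomial fit; the measurement pairs below are synthetic, for illustration only.

```python
import numpy as np

# Step 1: paired measurements - reference lux from a commercial
# meter vs. raw ADC readings (synthetic data for illustration).
lux_ref = np.array([100.0, 500.0, 1000.0, 2000.0, 3000.0])
adc = np.array([40.0, 175.0, 345.0, 690.0, 1020.0])

# Step 2: least-squares line v = c1*L + c2 through the ADC values.
c1, c2 = np.polyfit(lux_ref, adc, deg=1)

# Step 3: invert onto L = a*v + b so raw readings map to lux.
a, b = 1.0 / c1, -c2 / c1

def to_lux(adc_reading: float) -> float:
    """Convert a raw ADC reading to illuminance in lux."""
    return a * adc_reading + b

print(round(to_lux(345.0)))   # close to the 1000 lux reference
```

In practice one such (a, b) pair would be fitted per sensitivity setting, since changing the bias resistor changes the slope of the response.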

B. Calibration of Color Temperature

The color temperature of a light source can be defined as the temperature, in kelvin, of the black-body radiator that matches the hue of the light source [7].⁴ However, since many light sources other than incandescent light do not radiate like a black body, we instead use the correlated color temperature (CCT) to represent the color temperature of the light source.

CCT is a simplified way to characterize the spectral properties of a light source by choosing the temperature of an ideal black body that correlates best in terms of predicting how the light’s spectral curve contributes to its output. This may involve the image’s capture and projection systems and the human eye’s sensitivity.

The method used to calculate CCT requires sampling the light source with three silicon photodiodes, having red, green, and

⁴See http://en.wikipedia.org/wiki/Color_temperature.

blue sensitivities. These sensors are modeled while illuminated by an ideal black body in order to generate a theoretical locus that is later used to correlate the sampled light values, which finally gives the closest temperature match. Our calibration procedure adjusts the gain of each RGB value in order to match the model at a given color temperature.

Color temperature calibration is the procedure that sets the factors converting the raw RGB readouts into relative RGB light intensities. For color temperature computations, absolute intensity values are not necessary, so an arbitrary factor of one is applied to the green readout to normalize it; for red and blue, factors are chosen so that, in the model, the readouts represent the same color temperature that is measured with a commercial instrument. As long as the sensitivity setting for all sensors remains the same, the raw values can be transformed with these factors directly. If this is not the case, each factor should be divided by the sensitivity in order to make all samples comparable. These factors can be computed for more than one lighting condition and, eventually, averaged.

Another option for obtaining CCT is Robertson’s method [7], but it requires translating our RGB readings into the human XYZ color space. Converting from RGB to XYZ is straightforward, but the standard equations describe an RGB light source, not an RGB light sensor, converted to perceived XYZ color. One option here is to model an RGB light source with the same spectra as our RGB sensors. Regardless, this method tolerates a level of spectral shift, depending on the sampled light source, so it was left as a second alternative.
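As one concrete illustration of the XYZ route (not the authors' calibrated procedure), the sketch below converts linear RGB to XYZ with the standard sRGB matrix and applies McCamy's approximation for CCT. A sensor-specific transform would replace the sRGB matrix in practice.

```python
def rgb_to_cct(r: float, g: float, b: float) -> float:
    """Estimate correlated color temperature (kelvin) from linear
    RGB using the sRGB->XYZ matrix and McCamy's approximation.
    The sRGB matrix is a stand-in for a sensor-specific transform."""
    # Linear sRGB (D65 white) to CIE XYZ.
    x_ = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y_ = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z_ = 0.0193 * r + 0.1192 * g + 0.9505 * b
    s = x_ + y_ + z_
    x, y = x_ / s, y_ / s          # chromaticity coordinates
    # McCamy (1992): CCT as a cubic in the distance to the epicenter.
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# Equal RGB maps to the sRGB white point (D65), roughly 6500 K.
print(round(rgb_to_cct(1.0, 1.0, 1.0)))
```

McCamy's cubic is only accurate near the black-body locus, which matches this paper's use case of tungsten sources filtered by gels.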

The photodiodes we use can describe RGB light sources unambiguously, but here we expose them to unknown light sources and, in the latter method, ask them to stand in for emitting diodes. For light sources with spiky spectra, the correlation can become critical, so better knowledge of the spectra likely to be encountered on real film sets would help predict worst-case results.

The ideal filters for obtaining accurate CCT are the XYZ tristimulus filters, which correspond to the human eye’s color perception. They give unequivocal results, but they are neither cheap nor readily accessible.

The key to improving this subsystem is refining the model of the sensors’ spectral response, especially if any protective cover (e.g., the lumisphere) is to be installed, and including the human eye’s perceptual sensitivities in the correlation process. The inclusion of a fourth silicon diode sensor with a different sampling filter could also make the matching more robust.

C. Calculation of Incident Light Angles

Calculation of the incident light angle along an axis follows from (1), where I_A and I_B represent the output currents from active areas A and B of a light angle sensor

(1)

Utilizing the two orthogonal light angle sensors, we developed the following method to estimate the angle of a light



Fig. 3. Incident light angle estimation [8].

Fig. 4. Experimental system setup.

source projected onto the two-dimensional plane of the Illumimote (see Fig. 3). First, calculate the two angles θx and θy along the X and Y axes, respectively, using (1). The vectors of the two planes that embed the line from the Illumimote to the light source and intersect the X and Y axes can be obtained by (2)

V1 = (0, cos θy, −sin θy),  V2 = (cos θx, 0, −sin θx)    (2)

By calculating the cross product of the two vectors from (2), the vector of the line from the Illumimote to the light source can be calculated as in (3)

V = V1 × V2 ∝ (tan θx, tan θy, 1)    (3)

Therefore, the angle φ between the source line and the board normal is calculated as follows:

φ = cos⁻¹(Vz / ‖V‖) = tan⁻¹(√(tan² θx + tan² θy))    (4)
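Under the assumption that each axis sensor reports the angle of the source's projection onto the plane containing that axis and the board normal, the cross-product construction can be sketched as:

```python
import math

def source_direction(theta_x_deg: float, theta_y_deg: float):
    """Direction to the strongest source from the two per-axis
    angles. Each plane below contains one sensor axis and the
    line to the source; their cross product is that line."""
    tx = math.radians(theta_x_deg)
    ty = math.radians(theta_y_deg)
    # Normals of the two planes (the sensor board is the X-Y plane).
    v1 = (0.0, math.cos(ty), -math.sin(ty))   # plane through X axis
    v2 = (math.cos(tx), 0.0, -math.sin(tx))   # plane through Y axis
    # Cross product v1 x v2 gives the planes' intersection line.
    vx = v1[1] * v2[2] - v1[2] * v2[1]
    vy = v1[2] * v2[0] - v1[0] * v2[2]
    vz = v1[0] * v2[1] - v1[1] * v2[0]
    return (-vx, -vy, -vz)   # flip so the Z component is positive

def off_normal_angle(theta_x_deg: float, theta_y_deg: float) -> float:
    """Angle between the source line and the board normal, degrees."""
    vx, vy, vz = source_direction(theta_x_deg, theta_y_deg)
    return math.degrees(math.acos(vz / math.sqrt(vx * vx + vy * vy + vz * vz)))

# Source directly overhead:
print(round(off_normal_angle(0.0, 0.0)))    # 0
# Tilted 30 degrees along X only:
print(round(off_normal_angle(30.0, 0.0)))   # 30
```

With only one axis tilted, the off-normal angle reduces to that axis's reading, which is a quick sanity check on the geometry.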

V. EXPERIMENTAL RESULTS

A. Experimental System Setup

To evaluate the Illumimote, we integrated it into a wireless sensing system. The experimental setup is shown in Fig. 4. For a light source, we used a tungsten-balanced incandescent lamp

that generates a color temperature near 3200 K and provides about 3000 lux at a distance of 6 ft. This is a very common light source on film sets and has a well-defined, very specific color temperature. With this experimental setup, we compared the Illumimote against reference meters for incident light intensity and color temperature measurements. To generate diverse brightness levels, we placed the Illumimote at 11 different points, from 6 ft through 36 ft away from the light source in 3 ft steps. We set up two brightness settings to cover a wide intensity range: one for bright light intensities from 100 lux up to 3000 lux and the other for low intensities from 130 lux down to 4.7 lux. At each point, light intensities were measured wirelessly by the Illumimote. For reference lux values, a Sekonic L-558Cine light meter was used. For the color temperature measurements, we fixed the location of the Illumimote at 6 ft and applied several kinds of color filters (gels) to generate lights with 16 different color temperatures. We used a Konica Minolta Color Meter IIIf⁵ to measure the ground-truth color temperature for each light setting.
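Treating the lamp as a point source, the inverse-square law predicts the intensities at the 11 measurement points; the sketch below anchors it at the stated 3000 lux at 6 ft (an idealization that ignores reflections and beam shaping):

```python
def expected_lux(distance_ft: float, ref_lux: float = 3000.0,
                 ref_distance_ft: float = 6.0) -> float:
    """Illuminance predicted by the inverse-square law for an
    ideal point source, anchored at 3000 lux at 6 ft."""
    return ref_lux * (ref_distance_ft / distance_ft) ** 2

# Expected illuminance at the 11 measurement points (6-36 ft).
for d in range(6, 37, 3):
    print(f"{d:2d} ft: {expected_lux(d):7.1f} lux")
```

Over 6-36 ft this spans roughly 3000 down to about 83 lux, which is why a second, dimmer lamp setting was needed to exercise the sensor down to 4.7 lux.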

Three embedded software components were developed for the experimental wireless sensing system. First, we programmed sensor and sensitivity control software and downloaded it to the Illumimote board. We attached the Illumimote board to a MicaZ node, which has a 7.37 MHz 8-bit microprocessor and a 250 kbps ZigBee radio. Second, an Illumimote driver and a light-sensing application were programmed on the MicaZ mote in the SOS environment. SOS is an operating system for mote-class wireless sensor networks developed by the Networked and Embedded Systems Laboratory, UCLA [9]. Finally, at the base-station laptop, a Java program monitored and logged the light measurements, and a visualization interface supported real-time debugging and analysis. We developed a graphical user interface (GUI), shown in Fig. 5, that displays the status of the Illumimote in real time; it was used for testing, experiments, and demonstrations. The interface was implemented in Java and Processing [10]. This GUI makes it easy to test and evaluate Illumimotes visually and is a step toward an interface that a cinematographer could use in the future.

B. Performance Results

First, the light-angle sensors were evaluated by placing a tungsten-balanced incandescent lamp at all combinations of three angles (0°, 30°, and 60°) and three distances (nine points in total). The dual photodiodes (Hamamatsu S6560) on the Illumimote have the same shape and are placed symmetrically with respect to a vertical barricade between them, so the angle estimation performance is symmetric with respect to the X and Y axes. Therefore, only the first quadrant was tested, as performance in the other three quadrants is similar. We measured and estimated the angle ten times at each point. The light-angle estimates correlated well with ground truth, with an average error of about 3°. This angle-estimation experiment was performed with the prototype Illumimote board.

5http://konicaminolta.com.

Fig. 5. Screen shot of the real-time visualization interface.

The fabricated Illumimote contains an experimental design for the two diodes composing each channel of the light-angle sensor. The sensor is packaged in a common-cathode configuration, and it was hoped that by employing the design of Fig. 2, and thereby sharing the bias currents, we could accentuate the angle-to-voltage function and improve the resolution near apogee. Unfortunately, this approach proved highly unstable, with the input tending to saturate in favor of one of the diodes as the source approached apogee. An alternative voltage-based prototype was developed separately for this sensor; we used data from this prototype for the light-angle experiments described in this paper and will incorporate its architecture into the next revision of the Illumimote.

With the experimental setup shown in Fig. 4, we evaluated the measurement accuracy and dynamic range of the incident light intensity sensor. Across the 11 measurement points and two brightness settings, we collected 41 318 light intensity measurements in total from the Illumimote and applied a moving average filter with a window size of 20 to reduce random measurement noise. Then, calibration coefficients for each sensitivity setting were calculated from the first half of the raw intensity measurement data. Once the calibration coefficients are obtained, any light intensity measurement (ADC value) can be converted into a lux value by the methods in Section IV.
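The post-processing pipeline described above, a window-20 moving average followed by a per-setting linear mapping from ADC counts to reference lux, can be sketched as follows. The least-squares fit is our assumption of how the two calibration coefficients were obtained; the paper's exact procedure is in Section IV:

```python
import numpy as np

def moving_average(samples, window=20):
    """Smooth raw ADC samples with a sliding-window mean."""
    kernel = np.ones(window) / window
    return np.convolve(samples, kernel, mode="valid")

def fit_calibration(adc, lux):
    """Least-squares fit of lux ~ a * adc + b; returns (a, b)."""
    a, b = np.polyfit(adc, lux, deg=1)
    return a, b

def adc_to_lux(adc, a, b):
    """Convert smoothed ADC readings to lux using the fitted coefficients."""
    return a * np.asarray(adc) + b
```

In practice one pair of coefficients would be fitted per sensitivity setting, since each bias resistance gives the ADC output a different slope against lux (Fig. 6).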

To increase dynamic range and sensing resolution, the Illumimote offers a six-step sensitivity control with 1 kΩ, 667 kΩ, 1 MΩ, 1.7 MΩ, 2 MΩ, and 10 MΩ settings. We evaluated four representative settings among them: the 1 kΩ, 1 MΩ, 2 MΩ, and 10 MΩ bias resistances. Fig. 6 shows the ADC output with respect to light intensity in lux for these four sensitivity settings. As shown in Fig. 6, the output voltage of the light acquisition unit is linearly proportional to the light intensity, and the ADC data show that the sensitivity control unit (SCU) has an effective and desirable effect on the output. The light acquisition unit was confirmed to provide a rail-to-rail output range (0–1023 of the 10-bit ADC output) that operates from a stable 5 V reference regardless of the mote's operating voltage and battery state (assuming sufficient battery life remains to supply the required drive current). Figs. 7 and 8 confirm that adjacent sensitivity settings have overlapping ranges, which ensure reliable measurements across the transition points between SCU regions and provide margin for the hysteresis functions implemented in the SCU's control software.

Fig. 6. ADC output of four sensing sensitivity settings.

Fig. 7. Light intensity measurement for bright light setting.


Fig. 8. Light intensity measurement for low light setting.

Although we verified that the current fabricated Illumimote can measure up to 18 500 lux [8], this experiment focused on the practical light intensity region at or below 3000 lux. Exterior illuminance from sunlight is about 32 000 to 100 000 lux, but illuminance on many film sets, especially indoors, is lower. For example, moonlight provides about 1 lux, a bright office is lit at about 400 to 500 lux, and normal TV studios are lit at about 1000 to 2000 lux.6

Figs. 7 and 8 show the results from the two brightness settings. In the figures, the X axis represents the distance from the light source and the Y axis represents the light intensity measured in lux by the Illumimote. The tables below the figures show the average light intensity value and the average absolute error of the Illumimote at each point. The measurement region for each sensitivity setting was determined from the ADC output shown in Fig. 6; for example, each sensitivity region can be taken as the span corresponding to ADC output values from 25 to 1000. The measurement regions of our sensitivity settings are as follows: the 10 MΩ setting covers 0 to 160 lux, the 2 MΩ setting covers 110 to 700 lux, the 1 MΩ setting covers 450 to 1300 lux, and the 1 kΩ setting covers the region above 700 lux.
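Using the measurement regions listed above, the SCU's range selection with hysteresis can be sketched as below. The switching behavior, staying on the current setting until the reading leaves its region, is our illustration of how the overlap zones (e.g., 110–160 lux) can be exploited; the paper does not give the SCU control code:

```python
# Bias-resistance settings and the usable lux region of each, from the
# text; adjacent regions overlap to leave room for hysteresis.
REGIONS = [
    ("10M", 0.0, 160.0),
    ("2M", 110.0, 700.0),
    ("1M", 450.0, 1300.0),
    ("1K", 700.0, float("inf")),
]

def next_setting(current, lux):
    """Keep the current setting while the reading stays inside its
    region; switch only when it leaves (hysteresis via the overlap)."""
    for name, lo, hi in REGIONS:
        if name == current and lo <= lux <= hi:
            return current  # still in range: no switch needed
    for name, lo, hi in REGIONS:  # out of range: first region that fits
        if lo <= lux <= hi:
            return name
    return current
```

Because 130 lux lies in both the 10 MΩ and 2 MΩ regions, a reading hovering near the 10 MΩ upper edge does not cause the setting to chatter back and forth.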

As shown in the figures, the Illumimote measurements correlated with the Sekonic light meter readings within about 5% at most points, except at very low intensities (<8 lux). We attribute the discrepancies to the following factors that complicated the experimental design: placing the Illumimote and the Sekonic meter in exactly the same measurement position, the angle of the lumisphere, hand posture when measuring light intensities with the Sekonic meter (the measurements were manual), and light reflections from equipment in the environment. Measurements at very low intensities (<8 lux) appeared to be especially sensitive to these problems, which decreased the accuracy to more than 5% error on average.

To evaluate the color intensity sensors and our color temperature calibration methods, we collected 13 647 color intensity measurements from the RGB sensor channels under light settings with color temperatures ranging from 2900 to 6560 K. Fig. 9 compares the results to the ground-truth color temperature measured by the Minolta Color Meter IIIf. Color temperature results for two calibration settings (3110 and 5660 K) are shown. In both cases, our color temperature calibration

6See http://en.wikipedia.org/wiki/Lux.

Fig. 9. Color temperature calculation results.

methods correlated with the Minolta color meter within 5% on average. One of the key factors in improving this subsystem is refining the model of the sensor's spectral response, especially if any protective cover is to be installed.
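The Illumimote's own RGB-based calibration is described in Section IV. As a point of comparison only, one widely used closed-form way to estimate a correlated color temperature from CIE 1931 chromaticity (x, y) is McCamy's approximation, sketched below; this is a standard formula, not the paper's method:

```python
def mccamy_cct(x, y):
    """McCamy's approximation: correlated color temperature (kelvin)
    from CIE 1931 chromaticity coordinates (x, y)."""
    n = (x - 0.3320) / (0.1858 - y)  # slope relative to the epicenter
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# The D65 white point (x=0.3127, y=0.3290) should come out near 6500 K,
# and CIE illuminant A (x=0.4476, y=0.4074) near 2856 K.
```

Such a formula is only as good as the chromaticity estimate feeding it, which is why the sensor's spectral response model matters for any RGB-based approach.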

Throughout the experiments, we verified that the current wireless light-sensing system with the Illumimote can collect light intensities at 340 measurements per second, i.e., about 3 ms per measurement. Because one television video frame is captured over approximately 33.37 ms [11], the Illumimote (at 3 ms) is responsive enough to detect and process, in real time, lighting changes that would impact a single video or film frame. Regarding power consumption, the Illumimote consumes approximately 90 mW when all sensor channels are turned on.
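The timing claim above reduces to simple arithmetic: at 340 measurements per second, each sample takes just under 3 ms, so roughly 11 samples fall inside one 33.37 ms NTSC frame:

```python
rate_hz = 340.0     # measurement rate reported in the text
frame_ms = 33.37    # one television video frame [11]

sample_ms = 1000.0 / rate_hz               # about 2.94 ms per measurement
samples_per_frame = frame_ms / sample_ms   # about 11 samples per frame
```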

VI. CONCLUSIONS

The Illumimote, our new light-sensing module for the Mica mote platforms, achieves performance comparable to the commercial light and color meters used by professional cinematographers over the ranges indicated in our findings. It comprises incident light intensity, RGB intensity (for color temperature calculation), and incident light angle sensors, as well as thermal and attitude sensors. We characterized its performance and verified its capabilities. The project Web site hosts the technical data,7 and the Illumimote will soon be commercially available from Atla Labs8 to give other researchers access to the technology for their own experiments. Our future work includes further enhancements to the general characteristics of the Illumimote (such as dynamic range), estimation of the vertical incident light angle, and further development of the software tools that support and integrate the Illumimote, in support of its deployment on actual productions scheduled for the near future.

ACKNOWLEDGMENT

H. Park would like to express his appreciation to Samsung Electronics for their support.

7http://nesl.ee.ucla.edu/research/illumimote.
8http://www.atlalabs.com.


REFERENCES

[1] G. J. Pottie and W. J. Kaiser, “Wireless integrated network sensors,” Commun. ACM, vol. 43, no. 5, pp. 51–58, 2000.

[2] J. Burke, J. Friedman, E. Mendelowitz, H. Park, and M. B. Srivastava, “Embedding expression: Pervasive computing architecture for art and entertainment,” J. Pervasive Mobile Comput., pp. 1–36, Feb. 2006.

[3] M. Amundson, J. Friedman, V. Holtgrewe, and H. Park, “UCLA engineers collaborate on unique sensor system for film production,” UCLA Eng. News Center, Mar. 2005.

[4] N. M. Su, H. Park, E. Bostrom, J. Burke, M. B. Srivastava, and D. Estrin, “Augmenting film and video footage with sensor data,” in Proc. 2nd IEEE Int. Conf. Pervasive Comput. Commun. (PerCom), Mar. 2004.

[5] V. Singhvi, A. Krause, C. Guestrin, H. Garrett, and H. S. Matthews, “Intelligent light control using sensor networks,” in Proc. 3rd Int. Conf. Embedded Networked Sensor Syst., New York, 2005, pp. 218–229.

[6] F. Wagmister, B. McDonald, J. Brush, J. Burke, and T. Denove, Advanced Technology for Cinematography. Los Angeles, CA: UCLA, 2002 [Online]. Available: http://hypermedia.ucla.edu/projects/atc.php

[7] G. Wyszecki and W. S. Stiles, Color Science: Concepts and Methods, Quantitative Data and Formulae, 2nd ed. New York: Wiley, 1982.

[8] H. Park, J. Friedman, J. Burke, and M. B. Srivastava, “A new light sensing module for Mica motes,” in Proc. 4th IEEE Conf. Sensors, 2005.

[9] C.-C. Han, R. Kumar, R. Shea, E. Kohler, and M. Srivastava, “A dynamic operating system for sensor nodes,” in Proc. 3rd Int. Conf. Mobile Syst., Applicat., Services (MobiSys ’05), New York, 2005, pp. 163–176.

[10] B. Fry and C. Reas, “Processing,” [Online]. Available: http://www.processing.org/.

[11] P. Rees, “SMPTE EBU timecode,” [Online]. Available: http://www.philrees.co.uk/articles/timecode.htm.

Heemin Park (S’07) received the B.S. and M.S. degrees in computer science from Sogang University, Seoul, Korea, in 1993 and 1995, respectively. He is currently pursuing the Ph.D. degree at the University of California, Los Angeles (UCLA).

His master’s research focused on very large-scale integration computer-aided design. He was with Samsung Electronics, Co., Ltd., Korea, with a specialty in design for testability. He is a Graduate Student Researcher in electrical engineering at the Networked and Embedded Systems Laboratory, UCLA. His research interests include the design of networked and embedded computing systems, wireless sensor networks, entertainment computing, and ubiquitous computing.

Jonathan Friedman (S’03) has spent most of his professional career split between IT/MIS administrative duties and mixed-signal PCB design. He was the Director of Database Support Services for Sonic Associates (an IMAX company) and Director of US Technological Cooperation Students at the Chernigov State Institute for Economics and Management (Chernigov, Ukraine, 2002), and is the Founder of HalcyonIT, an IT outsourcing firm serving many small-to-medium size businesses. In his research at the Networked and Embedded Systems Laboratory, University of California, Los Angeles (UCLA), he is interested in improving the physical sensor layer of wireless mobile embedded sensor networks through more advanced implementations and designs. Specifically, he is targeting the problem of locating a sensed entity by implementing a new architecture for robust (noise-immune), low-latency (many positional fixes per second), high-accuracy localization for mobile nodes. Additional interests lie in relaxing the deployment constraints and requirements for static (nonmobile) beacons. His application areas of interest lie in entertainment: in collaboration with UCLA’s School of Theater, Film, and Television, his work is being shaped and adapted to fit the unique constraints of this application space.

Pablo Gutierrez graduated in electrical engineering from the Catholic University of Chile (PUC) in 1991.

He is currently with the Center for Research in Engineering, Media and Performance, School of Theater, Film and Television, University of California, Los Angeles. He has broad experience in field engineering, including integrating motion control systems for visual effects in Los Angeles, servo systems for the telescopes of the European Southern Observatory (ESO-Paranal), defense monitoring and communication systems for the Chilean navy, digital aerial photography for the mining business, and work as an Operator for the oil logging services of Schlumberger. His current interest is integrating networked sensors and actuators into the modern film production workflow, both for better efficiency and to discover new creative possibilities.

Vidyut Samanta received the bachelor’s degree in computer science from Purdue University, West Lafayette, IN, in 2002 and the M.Sc. degree from the University of California, Los Angeles (UCLA), in 2005.

During his years at Purdue, he was a Research Assistant with the Secure Software Systems Lab and worked on projects involving compiler construction and programming languages. While studying at UCLA, he worked on projects in the wireless and mobile networking field.

Jeff Burke is Executive Director of the Center for Research in Engineering, Media and Performance, University of California, Los Angeles (UCLA). He has coauthored, designed, coded, or produced performances and new genre installations exhibited in eight countries, coordinating diverse teams spanning the arts and engineering. He has taught in the School of Theater, Film and Television, UCLA, as well as in the graduate industrial design program at the Art Center College of Design.

Mani Srivastava (S’87–M’90–SM’02) received the Ph.D. degree in electrical engineering and computer science from the University of California, Berkeley, in 1992.

Currently, he is a Professor in the Electrical Engineering Department, University of California, Los Angeles (UCLA). He is also associated with UCLA’s Center for Embedded Networked Sensing (CENS), a National Science Foundation Science and Technology Center. Prior to joining UCLA, he was with Bell Labs Research. His current interests are in embedded sensor and actuator networks, wireless and mobile systems, embedded systems, power-aware computing and communications, and pervasive computing.

Prof. Srivastava is a member of the IEEE Computer Society.