
Themes and Trends in Space Science Data Processing and Visualization

Stuart R. Nylund and Douglas B. Holland

Data processing and visualization in the APL Space Department have been developed as tools to support scientific research. Every space mission has required the basic tasks of spacecraft telemetry data access, decoding, conversion, storage, reduction, visualization, and analysis. The research goals of each mission are different, requiring customized development and support of these standard tasks that draw upon our heritage of experience. Along with these changes in research goals, there have been revolutionary developments in the computer software and hardware environment. However, there are common themes and trends among all of these changes, which provide perspective on past work and insights into future development. (Keywords: Data processing, Space Department history, Space science data, Visualization.)

INTRODUCTION

With the start of operations, each new mission enters the phase of accumulating the data necessary for achieving its science objectives. These data as received on the ground, however, often bear little resemblance to the original sensor measurements needed to support study and analysis. They have been fragmented through buffering, intermixing, and encoding in engineering shorthand to fit the most information into the limited telemetry bandwidth. Noise and data gaps may further compound these changes.

Before even the most basic research can start, the processing step begins: the measurements are extracted and decoded using the detailed telemetry map provided by the instrument engineer. Data are organized and stored so as to allow easy access. Through the application of calibration information, sensor influences on measurements are removed to produce data of the actual environmental conditions. Now ready for analysis, the data are transformed and condensed optimally to address the research questions. The final derived data are often best presented graphically to efficiently reveal complex levels of detail. With the production of visualizations, the data are finally returned to the science process.

In the Software and Data Systems Group of the Space Department, we design data processing and visualization systems for a variety of space science missions (Fig. 1). These systems perform the operations to restore, archive, and access the data for scientific analysis. The design must be developed within cost, schedule, and performance constraints.


Figure 1. Examples of visualizations for Space Department missions. (a) Polar BEAR Ultraviolet Imager auroral image, (b) SuperDARN radar real-time ionospheric convection, (c) Defense Meteorological Satellite Program F7 ion differential energy flux, (d) Energetic Neutral Atom (ENA) imaging model of Earth’s ring current, (e) Galileo Energetic Particles Detector electron intensity anisotropy, and (f) Geosat radar altimeter sea surface waveheight.


Figure 2. To derive the most scientific return for a mission, the data rate for an instrument is set as high as possible. The figure shows the upward trend in sustained data rates for particle, magnetic field, and imaging instruments permitted by increases in telemetry downlink capacity and instrument sampling and processing power. Onboard processing also allows the data rate to be more efficiently used through the application of data compression algorithms and onboard data reduction and analysis. (NEAR = Near Earth Asteroid Rendezvous, ACE = Advanced Composition Explorer, ISEE = International Sun Earth Explorer, IMP = Interplanetary Monitoring Platform, AMPTE/CCE = Active Magnetospheric Particle Tracer Explorer/Charge Composition Explorer, TRAAC = Transit Research and Attitude Control, TIMED = Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics, MSX = Midcourse Space Experiment.)

Whereas the general procedures of data processing and visualization are repeated for each mission, each application is unique owing to the influences of data, science objectives, and the current computer technology. Because of the research nature of these missions, we work closely with the scientists to ensure that the design meets their needs. The final design draws on our background in space science data processing and visualization systems to make the best trade-offs to maximize the science results.

Changes in Mission Data Processing

The patterns seen in present data processing work are evident in the APL space missions of the 1960s. Although lacking the complexity of later missions, they illustrate all aspects of data processing and visualization.1 The data processing and visualization of that earlier time involved analog and digital signal processing tasks. The whole scope of the task was likely to be performed by the researcher with a minimum of automated tools. Data were often processed manually to remove bad data and then were plotted manually by a graphic artist.


Whereas these higher-level tasks remained the same in later missions, scientific understanding matured with successive missions and general measurements became more focused and detailed. For example, simple isotropic measurements of relative particle count rates were expanded to absolute flux resolved in three-dimensional space as well as differentiated among particle species and at times even among specific charge states and isotopes. In addition to increased measurement complexity, the volume of measurements has also increased with the capabilities of the spacecraft and bandwidth of the telemetry systems (Fig. 2). Fortunately, the capabilities of ground networks, computers, and storage facilities have also increased, fueled by the same technological advances.

Computer science has matured, with advanced software development environments providing more rapid and complex methods for both data processing and visualization.2 The roles of the developer and the researcher have continued to change and overlap. The varieties of data processing for APL missions and the types of instruments and research fields have continued to increase into many new areas.

Trade-Offs

A mission, which originates as a scientific enterprise, becomes an engineering effort. Trade-offs similar to those of spacecraft engineering are made in development of the data processing and visualization system. Ease of use is often in opposition to rapid development. The need to preprocess the data to speed subsequent access must be balanced by the need to keep the data as unprocessed as possible to retain the maximum information. The complexity and volume of data in the final display can slow production. The needs of the current mission must be balanced against the recognition that the data are a significant national resource3 that must be preserved for the future. Automation, which can repeatedly create standard results, must be traded with manual control, which can produce a limited number of individualized results.


DATA PROCESSING AND ARCHIVING

Basic Data Processing Operations

Data processing and archiving are the operations required to make data available for research. Of the six stages shown in Fig. 3, three are basic processing operations. The first data operation is reconstruction, which re-creates the original instrument measurements from telemetry. It requires undoing the effects of transmission, which may include buffering, commutation, multiplexing, packetization, encoding, and compression, and results in a data volume similar to that originally created at the instrument. The reconstruction in some cases will include time ordering and missing data handling. The data as received are ordered by requirements of the transmission system; for example, all of the data from each downlink are together. Reconstruction operations often include a reorganization of the data by mission event, like orbit number, or by time (e.g., by year and day). This first level of cleaned reconstructed data is often saved as a raw data archive because it contains all of the instrument measurements that will ever be available to the researcher.
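To make the reconstruction step concrete, the minimal sketch below (in Python, chosen here purely for illustration) decommutates a stream of fixed-length minor frames and restores time order. The frame layout, sync pattern, and field names are hypothetical stand-ins for the telemetry map an instrument engineer would actually supply.

```python
import struct

# Hypothetical minor-frame layout; the real telemetry map is supplied by
# the instrument engineer and differs for every mission.
FRAME_LEN = 16          # bytes per minor frame (assumed)
SYNC = b"\xfa\xf3\x20"  # 3-byte frame sync pattern (assumed)

def decommutate(frame):
    """Extract instrument words from one minor frame per the assumed map:
    sync (3 bytes), frame counter (1), seconds of day (4), four 16-bit counts."""
    counter, seconds = struct.unpack_from(">BI", frame, 3)
    counts = struct.unpack_from(">4H", frame, 8)
    return {"counter": counter, "seconds": seconds, "counts": counts}

def reconstruct(stream):
    """Split a downlinked byte stream into frames, drop fragments that
    lost sync, and restore time order."""
    frames = (stream[i:i + FRAME_LEN] for i in range(0, len(stream), FRAME_LEN))
    records = [decommutate(f) for f in frames
               if len(f) == FRAME_LEN and f.startswith(SYNC)]
    records.sort(key=lambda r: r["seconds"])  # time ordering
    return records
```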

The second basic operation is data conversion, which corrects instrumental effects on measurements to reconstruct original environmental conditions by accounting for sensor efficiency, view aspect, flight processing, and sensor noise. The results are measurements in engineering units that are held as floating point numbers, a format that allows a large dynamic range of values. At this stage, the data are ready for the project engineers to use and are a starting point for researchers.
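A minimal sketch of one such conversion, with invented calibration constants standing in for values that would come from prelaunch and in-flight calibration of a real sensor:

```python
# All constants here are illustrative assumptions, not real calibrations.
GEOM_FACTOR = 1.5e-3   # cm^2 sr, sensor geometric factor (assumed)
EFFICIENCY = 0.82      # detector efficiency (assumed)
ACCUM_TIME = 0.5       # s per count accumulation (assumed)
BAND_WIDTH = 30.0      # keV energy channel width (assumed)
DEAD_TIME = 2.0e-6     # s lost per recorded count (assumed)

def counts_to_flux(counts):
    """Convert a raw accumulator count to differential particle flux in
    1/(cm^2 sr s keV), stored as floating point for dynamic range."""
    rate = counts / ACCUM_TIME                    # counts/s
    rate /= 1.0 - rate * DEAD_TIME                # first-order dead-time correction
    return rate / (GEOM_FACTOR * EFFICIENCY * BAND_WIDTH)
```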

Figure 3. Process flow diagram. The volume of data varies with the stages of the mission. Whereas the relative volume differs from mission to mission, similar contractions and expansions in the data occur at each stage along the data flow.

[Figure 3 diagram: relative data volume rises and falls across the mission stages, from sensor and instrument measurements through telemetry, data reconstruction, data conversion, and data reduction to visualization.]

The third basic operation in processing is data reduction. This most often requires operations for selecting just part of a data set and decreasing the temporal and/or spatial volume, usually by averaging operations. A further analysis step is often the removal of noise and backgrounds. Beyond these common steps, the processing development varies with each instrument, each mission, and differing research objectives.
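A sketch of the two commonest reduction operations named above, background removal and time averaging, assuming a simple list-based series:

```python
def reduce_series(times, values, bin_width=60.0, background=0.0):
    """Subtract a constant background estimate and average an irregular
    time series into fixed-width bins (e.g., 60 s), returning
    (bin start time, mean) pairs in time order."""
    bins = {}
    for t, v in zip(times, values):
        bins.setdefault(int(t // bin_width), []).append(v - background)
    return sorted((k * bin_width, sum(vs) / len(vs)) for k, vs in bins.items())
```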

Archiving

Data archives serve the researcher in several ways. They can speed access to the data, create a standard form to facilitate further analysis and visualization operations, and preserve the data for future use. Processed data archives can be tailored to the research need, with standard operations already performed and unneeded data removed. These advantages may become disadvantages if the processed data omit an important part of the raw data or if information is lost in the conversion and reduction processes, making it necessary to go back to the raw data. For this reason, one of our goals in archive creation is to keep it at as low a level as is consistent with access requirements.

Another value-added step to archiving can be the creation of indexes and key parameters extracted to facilitate data browsing. This step is especially critical in missions with extremely large volumes of image data. The MSX (Midcourse Space Experiment) mission, with imagers such as UVISI (Ultraviolet and Visible Imagers and Spectrographic Imagers), which had an intermittent data rate of nearly 10^6 bits/s, defined a standard set of parameters that were created to summarize image characteristics and thereby speed selection of images of interest.
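As an illustration of such a key-parameter index (the parameter names and record layout below are invented, not the MSX set), a small summary table can be extracted once and scanned instead of the full image archive:

```python
import csv

def build_index(image_records, path="image_index.csv"):
    """Write one row of key parameters per image so a researcher can
    browse and select images of interest without touching the archive."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["image_id", "start_time", "mean_intensity",
                         "peak_intensity"])
        for rec in image_records:
            pixels = rec["pixels"]                 # flat list of samples
            writer.writerow([rec["id"], rec["start_time"],
                             sum(pixels) / len(pixels), max(pixels)])
```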

The archives of older data sets must be preserved, as they are recognized as a timeless resource containing the only historical measurements in a field. Designs for this long-term preservation are becoming a standard part of newer missions. With the increased use of networks to support research, archives are increasingly on-line facilities. Many of the missions presented on the Space Department Programs Web site (http://sd-www.jhuapl.edu/programs.html) have or are preparing such access.

Trends in Data Processing

When we design a data processing system to meet the requirements of a new mission, many factors are balanced, but the primary ones are science requirements, cost, and schedule. These basic, driving forces are further divided into requirements of staffing and computer hardware and software development. These factors are traded to find the best mix for each mission. Early missions relied heavily on staffing because of hardware and software limitations. Newer systems such as the NEAR (Near Earth Asteroid Rendezvous) Science Data Center rely much more heavily on hardware and software automation to perform tasks.

Delays between measurement and availability depend on the balances in the design. Early missions had low data volumes and little delay. As the spacecraft systems evolved, higher mission volumes and the need for real-time access drove the development of faster processing systems. The trend for higher data volumes accelerated as imaging instruments became part of missions.

Standardization

It is critical for the processing system to support the science objectives of the mission. Mission requirements define the data characteristics, timing, volume, packetization, and delay. In early missions, system designers worked very closely with the research team. These systems were as individual as the mission. More recent systems developed by our group have begun to include standardized processing components. This trend has been accelerated by standards such as those from the Consultative Committee for Space Data Systems on data transmission in the space science community. The NEAR, TIMED (Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics), and FUSE (Far Ultraviolet Spectroscopic Explorer) missions share design components of the telemetry distribution process. These missions have taken advantage of the efficiency of having a single data center perform some of the common functions for distributed science teams.

This concept of a central data center was advanced in the AMPTE/CCE (Active Magnetospheric Particle Tracer Explorer/Charge Composition Explorer) Science Data Center, designed and operated by APL, which provided mission researchers with nearly instant access to processed real-time data from all instruments. This dedicated facility also served as the data archive and analysis center for all subsequent studies.4 Data centers have continued to evolve with the design of later facilities for the MSX, NEAR, and TIMED missions. The recent trend for “faster, cheaper, and better” missions will promote further efforts to standardize and thereby reuse parts of the data processing systems.

Development standards are also evolving for instrument science centers. Examples are the centers for analysis of data from Voyager LECP (Low Energy Charged Particle), Galileo EPD (Energetic Particle Detector), and Geotail EPIC (Energetic Particle and Ion Composition) instruments. These centers use the heritage of specialized knowledge required for processing data of a particular instrument. The standardization of processing systems is creating the need for specialists in areas of data archiving, databases, space science data analysis, and network data distribution.

Archives are also becoming more standardized. The NASA Planetary Data System and Earth Observing System both have standards for data distribution to a wide community of users and plans for long-term data set preservation. Early development work is being done to establish standards for instrument data representations. The goal is to find a lowest-common-denominator form of the data that will allow similar instruments from many missions to be used in common studies, making use of standard data interchange, processing, and visualization tools.

Software Development

The development of larger processing systems and the increased need for them to support many users have increased the need for quality in the software development process. The Space Department is continuing the development of a quality assurance process. The challenge is to tailor the process for our environment and for the requirements of each program.

This balancing of requirements has led us in many unexpected directions during the history of the Department. In early missions, the researchers themselves processed data. Processing requirements and computer capabilities led later missions to move toward central processing facilities, which often caused delays. This trend evolved to the use of minicomputers in an effort to reduce delays but at the initial sacrifice of raw processing power; this problem was then alleviated by around-the-clock processing. Now, PC cost decreases and power increases are leading recent missions back to desktop processing (Fig. 4).

In the software design area, early missions focused on assembly language development. As processing power increased, higher-level, more algorithmic languages like Fortran became the standard. The newest missions are using higher-level languages such as IDL, C++, and Java, which allow the developer to more easily perform a task that could take much longer in lower-level languages. Such high-level languages offer specialized support to data analysis and visualization, graphical user interfaces, network communications, and network information sharing paradigms. They may also provide built-in support for software quality development methods. Most importantly, these tools are bringing software development back closer to the researchers and encouraging interactive development.


[Figure 4 plot: CPU speed (Mflops) and on-line/off-line storage capacity (GB) versus year, from the Univac 1103, IBM 7094, and IBM 360/91 mainframes through PDP-11 and VAX minicomputers to SGI and HP 9000 workstations, with media from 9-track 6250-bpi tape through WORM optical, CD-ROM, 8-mm tape, and DLT.]

Figure 4. Against the backdrop of rising instrument data rates from Fig. 2, it can be seen that the computing resources available within the Space Department have also increased. In the 1970s, minicomputers began to be used more than mainframe systems as a way to increase user interactivity with the data and reduce processing turnaround. The resulting loss in processing capability relative to data rates was temporary. (DLT = digital linear tape, WORM = write once/read many.)

VISUALIZATION

Once processed and archived, the mission data are ready for the researchers to use. Whether used directly in archival form or reduced by additional processing, there must be a way to examine the data. Textual and tabular presentations of data are inadequate for analysis and interpretation of large data sets.5 Visualization, the presentation of data as graphical images or abstract depictions, gives researchers far better access to quantitative information by leveraging their perceptual capabilities.6 Graphical displays attempt to convey large amounts of data in a compact format that simultaneously presents both an overview and details.7

We develop software applications that provide researchers with these views into the archives of data. The design of these applications results from the influences of scientific needs, the data, and computer technology. Our software specialists, working closely with the scientists to understand their needs as well as the complexities of the data, select display formats such as line plots, images, and spectrograms for each type of data and arrange them with supporting information into page layouts. We are responsible for deciding the languages, formats, and methods by which the software will interact with the user, access and manipulate the data, and construct these layouts.


Although we have no single design consensus because of the diversity of scientific needs and data influences, there are commonalities in software design. Because our data archives often contain uncalibrated data, visualization must include elements of data processing software to convert and reduce data for display. At present, our designs maintain a distinct division between data processing and visualization software. By treating these two operations as discrete modules of function, the visualization function is more standard and reusable. The separation also makes the visualization design more focused and less complex.
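One way to express that division in code (a hypothetical interface sketched with Python's typing module, not any actual Space Department design) is to let the visualization layer depend only on an abstract data source:

```python
from typing import Protocol, Sequence, Tuple

class DataSource(Protocol):
    """What the visualization module is allowed to see: clean series,
    never telemetry formats, calibration files, or archive layouts."""
    def series(self, channel: str, start: float, stop: float
               ) -> Tuple[Sequence[float], Sequence[float]]:
        """Return (times, values) for one channel over [start, stop)."""
        ...

def render_line_plot(source: DataSource, channel: str,
                     start: float, stop: float) -> None:
    times, values = source.series(channel, start, stop)
    # ...drawing code only; any mission's processing module that
    # implements DataSource can reuse this function unchanged.
```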

Influence of Science Objectives

Visualization supports scientific research of data in three general endeavors: surveying large archives of data, studying localized portions or aspects of data, and illustrating discoveries for presentation and publication. Each task requires data processing to retrieve items from the archive and to optionally convert and reduce these data by temporal and spatial averaging before they are visualized. These similarities have encouraged the design of a single visualization tool for a mission to support all survey and, to successively lesser degrees, study and illustration efforts.

Advantages of common development and reduced maintenance costs in such a single design need to outweigh design complexity and possible performance reduction. The complexity of a visualization tool could increase unacceptably if it were required to provide all possible display formats of research study as well as illustrations. The alternative for these more demanding but less frequent cases is to rely on other visualization methods. Our common approach is to use a commercial graphics application on a PC (e.g., MATLAB, Mathematica, Igor, Noesys, and even Excel) to create customized displays. The user iteratively constructs a display layout from input data (usually columns of text). Although useful for this task, these applications lack the robust bulk processing capability that the visualization tool provides for survey work.

For publication-quality visualizations, the graphic artist still serves an important role in enhancing computer-generated displays or composing figures that are beyond the ability of either of the previous methods. In the early missions, this was the only method for providing illustrations for scientific journals; however, the quality of computer visualizations has improved dramatically.

Beyond creating views that the scientists need, a visualization system must be intuitive to operate and sufficiently quick in producing results. Tedious or confusing interfaces and long processing delays only distract or frustrate the researcher’s focus on science. Commercial programs with pull-down menus have done much to simplify how users can enter input. These interfaces are now often used in our designs, such as that shown in Fig. 5, to guide users to lists of options rather than requiring them to be remembered. Alternately, visualizations can be used as a key to select further visualizations. When the Galileo orbital path, shown in Fig. 6, is clicked on at any point along its track, detailed data are displayed for that period.

Figure 5. An example of a multiple-window widget interface for a visualization tool. To control display options, the user interacts with a base window and is presented with pop-up windows containing default choices that the user may change.

Researchers are regaining a direct involvement in visualization that was once common in early missions when data volumes were low. Although far from true for all, some researchers have been limited to interaction with data through programmers. This trend is reversing in the Space Department. The desktop PC is the central location for the researcher’s activities of e-mail correspondence, word processing, and information retrieval via the World Wide Web. With graphics applications, telnet connections, and ftp transfers, personal computers have become functional platforms for analysis in which our visualization designs must also be able to operate.


Figure 6. A touch-sensitive orbital plot display (a) provides access to Galileo EPD energy spectrograms (b).



Influence of Data

Space Department missions have resulted in measurements of charged particles, magnetic fields, cosmic rays, and the electromagnetic spectrum (from radar through infrared, visible, and ultraviolet frequencies to gamma rays) as evidenced in Fig. 1. These measurements can be grouped as either remote or in situ. Remote measurements are of distant conditions that propagate to the sensor, such as starlight measured by photometers, whereas in situ measurements are of conditions directly encountered by the sensor, such as the field measured by an orbiting magnetometer. For visualization, the distinction is significant because the display construction and formats differ. Remote sensing data are most usually rendered as an image, meaning a likeness of the actual object, while in situ data are rendered abstractly, e.g., as a line plot.

Creating an image display is more than assembling picture elements, or pixels, in rows and columns on a screen. Each pixel represents a data measurement by a sensor element that must be adjusted for size in the final display using supplemental information. As seen in Fig. 1a, the pixels from the orbiting Polar BEAR Ultraviolet Imager are more finely grained in the center of the image than at the edges where the Earth curves from view. Such tuning is derived from a detailed understanding of the attitude of the spacecraft and look direction of the sensor. To enhance the understanding of the data, geographic coastlines and map gridlines may be added, when appropriate, by a detailed reconstruction of spacecraft position that is combined with the attitude and look-direction information.
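The coarsening of edge pixels follows from simple viewing geometry. The sketch below uses a flat-Earth approximation (a deliberate simplification; real image registration uses full attitude and ephemeris reconstruction) to show how the ground footprint of a fixed angular pixel grows away from nadir:

```python
import math

def footprint_length(altitude_km, off_nadir_deg, ifov_deg):
    """Ground distance (km) spanned by one pixel of angular width
    ifov_deg viewed off_nadir_deg from vertical, flat-Earth model."""
    half = math.radians(ifov_deg) / 2.0
    theta = math.radians(off_nadir_deg)
    return altitude_km * (math.tan(theta + half) - math.tan(theta - half))

# For an assumed 1000-km orbit and 0.5-deg pixels:
# footprint_length(1000, 0, 0.5)  -> about 8.7 km at nadir
# footprint_length(1000, 60, 0.5) -> about 35 km toward the limb
```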

For in situ data, a display format must be chosen that is most appropriate for either illustrating the data or enhancing data features. In Fig. 7, several energy channels of particle data are presented in two different display formats. The upper line plot better conveys absolute data values while the lower spectrogram gives a better overall relational view of the channels. A wire diagram as illustrated in Fig. 1e conveys still different meaning. In general, increases in data complexity, defined as more distinct but related information, have resulted in the reliance on more densely populated plots. The use of color is a standard method to address this demand.
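The two formats of Fig. 7 can be produced from the same array, as in this sketch (Python with NumPy and Matplotlib, using synthetic channel data in place of any real instrument's):

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic stand-in for several energy channels of particle rates.
times = np.linspace(0.0, 24.0, 500)                      # hours
energies = np.array([30.0, 60.0, 120.0, 240.0, 480.0])   # keV (assumed)
rates = np.outer(energies ** -2.0, 1.0 + 0.5 * np.sin(times / 2.0)) * 1e6

fig, (ax_line, ax_spec) = plt.subplots(2, 1, sharex=True)

# (a) Line plot: best for reading absolute values channel by channel.
for e, r in zip(energies, rates):
    ax_line.semilogy(times, r, label=f"{e:.0f} keV")
ax_line.set_ylabel("rate (counts/s)")
ax_line.legend(fontsize="small")

# (b) Spectrogram: best for the relational view across all channels.
mesh = ax_spec.pcolormesh(times, energies, np.log10(rates), shading="auto")
ax_spec.set_yscale("log")
ax_spec.set_xlabel("time (h)")
ax_spec.set_ylabel("energy (keV)")
fig.colorbar(mesh, ax=ax_spec, label="log10 rate")
plt.show()
```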

As much as any factor, the growth of data rates and hence the volume of the data archives have driven the need for automating visualizations. Where it was once possible to plot data by hand with reasonable precision from tables of numbers for the missions of the 1960s, the volume of data to be shown made it necessary to program plotting devices to render line plots in the 1970s and, by the 1980s, color displays.2 When individual measurements became too numerous to follow, displays of time-averaged data for all or representative data items were designed for researchers to inspect conveniently. Survey or standard data plots, which cover a fixed time interval such as an orbit or a day, have become a routine adjunct of data processing. The number of different types of these survey plots depends in part on the complexity of the instrument. The simple Geiger counters of missions such as Injun I have been replaced by instruments that not only count ions but distinguish their type, time of arrival, direction of travel, and energy. This increase in measured characteristics compounds the number of possible display combinations.
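At its core, automating such fixed-interval survey products reduces to a batch loop over the interval, as in this sketch (the renderer callable and file naming are placeholders, not an actual production system):

```python
from datetime import date, timedelta

def make_survey_plots(start, stop, plot_day):
    """Render the standard daily survey plot for every day in
    [start, stop); plot_day is any callable that draws and saves one
    fixed-format plot for the given date."""
    day = start
    while day < stop:
        plot_day(day)
        day += timedelta(days=1)

# Example: name each product by date, e.g., survey_19990101.png
# make_survey_plots(date(1999, 1, 1), date(1999, 2, 1),
#                   lambda d: print(f"survey_{d:%Y%m%d}.png"))
```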

Figure 7. Identical data presented in (a) line plot and (b) spectrogram format to illustrate the data presentation strengths of each.

Influence of Computer Technology

Whereas the growth in size and complexity of data has presented new challenges to visualization, improvements in computer technology have often provided the resources for improved designs to meet these challenges. To illustrate this concept, it is useful to think of two idealized designs for satisfying researchers’ views into the archived data. One creates displays of every possible data combination and stores them for instant retrieval. The other uses software that lets the researcher compose and instantly create any possible display. While neither approach is possible, our actual approach is to use aspects of these two methods: to create a limited set of standard plots for survey studies and to provide a software tool for the researcher to create nonstandard displays of data. How well this design works depends largely on its flexibility and how promptly it performs.

Price and performance improvements in hardware and software have improved the achievement of this design. With larger and faster disks, more of the data archive can be stored on-line and read faster; more preprocessed standard plots can be cached for quick display. With faster processors, the software runs through its tasks more quickly to create the displays and standard plots. Newer versions of operating systems and programming languages can produce similar improvements through better efficiency. These returns have the additional benefit that they require little or no change to our application software.
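The middle road between the two idealized designs reduces to a cache-or-generate pattern, sketched here with hypothetical names (the plot_cache directory and render callable are inventions for illustration):

```python
import os

def get_standard_plot(day, plot_type, render, cache_dir="plot_cache"):
    """Return the path of a cached standard plot, rendering and caching
    it first if it does not yet exist."""
    path = os.path.join(cache_dir, f"{plot_type}_{day}.png")
    if not os.path.exists(path):
        os.makedirs(cache_dir, exist_ok=True)
        render(path)   # caller-supplied renderer writes the image file
    return path
```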

Innovations in computer technology also improve visualization design by providing new methods that produce better results. PCs, the Internet and the Web, MPEG and QuickTime movies, and inexpensive color printers are just a few innovations that have greatly improved what can be done. Two successive missions illustrate the effects and the pace of innovations. In 1984, for the AMPTE/CCE MEPA (Medium-Energy Particle Analyzer) particle instrument, researchers walked to a central terminal room to execute individual graphics programs for each type of color plot (see Fig. 8a), which were then displayed on a shared TV monitor. Hard copies of these plots were made with a 35-mm camera, as were the standard data plots that were distributed for the mission as packets of slides.

In 1993, for the Geotail EPIC particle instrument, researchers were able to create more detailed plots (Fig. 8b) using a single software tool that, running on a central workstation, could display the plots on their desktop PC. Color plots could be printed, and standard plots were distributed as books of plots. Since 1995 the entire set of standard data plots has been available as electronic GIF-formatted images for Web users. The entire EPIC data archive is now being made accessible from the Web so that users will also be able to select and download data of interest as text listings to create their own images.

The distinction between workstations and PCs continues to blur. Workstations are being equipped with fast and inexpensive PC CPUs, while PCs are being installed to run workstation operating systems such as Linux. Placing systems on desktops provided the important extension of data access to the scientists. Whether the desktop system performs this visualization, and even the underlying data processing, will largely depend on the performance and pricing of available hardware.

Figure 8. Examples of summary displays for two particle instruments: (a) AMPTE/CCE MEPA from the 1980s and (b) Geotail EPIC from the 1990s. Advances in display and reproduction technologies permitted greater detail to be shown.


User access to visualizations is as important as their creation. Researchers benefited when PCs enabled them to remain at their desks to view graphics displays. Expanding the access to these visualizations across the Internet is equally important. Space science researchers have a long tradition of collaborative exchange of information among the national and international science community; this tradition was the driving force for the creation of the Web by particle physicists at CERN (Conseil Européen pour la Recherche Nucléaire).

Whereas the Internet and the Web will continue to provide new capabilities, much can be done to take advantage of the current potential. Beyond providing only displays of refined archival data, the SuperDARN Web site displays real-time radar images (Fig. 1b) from remote automated arrays (see the SuperDARN entry under the Space Department Programs Web site). For the Polar BEAR Ultraviolet Imager, MPEG movies (Fig. 9) of substorm intervals have been assembled by supplementing the data frames with transmorphosed frames (see the Auroral Particles and Imagery entry at the same site). Views such as these enable researchers to share with the community dynamic images that previously were available only at their own facility and that were never possible through publications.

FUTURE CHALLENGES

How can we best serve our users in the future? We must remain closely involved with the space science research that we support and informed of the discipline’s trends in data processing and visualization. We must also continue to extend our expertise in software and hardware systems so as to take advantage of the rapid advances in these fields.

Figure 9. MPEG movie presentation of Polar BEAR Ultraviolet Imager substorm data. This format, available on the Web, shows the dynamics of the auroral imagery.

Researchers are increasingly focusing on multiple instruments and on combinations of ground and space measurements. Standards in data sharing and visualization will aid these efforts. Reductions in spacecraft size make it likely that a future mission will fly a cluster or swarm of satellites. Studies of this type will need processing methods that allow for the effects of propagation delays. Practical visualization techniques will be needed that can display volumetric data. Currently, data archives are optimized for the studies of each particular mission. Future data archives will make the data accessible to a wider range of users and will encapsulate the knowledge of the mission data set in the archive to benefit other users. Sponsors are requiring that data sets be made available to give more return on the mission costs. The challenge will be to meet cost and science goals of active missions while providing the systems to deliver these data to the community, from data servers on the Internet to portable laptops employing archives on CD-ROM.

Processing and visualization tasks are currently constrained by the system and the software development time required to best use the bounty of available hardware and software tools. The solution lies in increased specialization of people and of development areas that will allow better use of this focused knowledge. These segmented tasks will increase the ability of software designers to adjust to science and technology advances.

The design of data processing and visualization systems is practiced in an environment of change in which the field of space science is advancing and the field of computer science is refining into an engineering discipline. Given the nature of scientific research to avoid repetitive studies, designs may not be wholly duplicated from one mission to the next. Because of the continual advances in computer technology, no design can be expected to remain unaltered, even for the life of a mission. Scientific research, as an application of computer resources, encourages these new designs to better facilitate its advancement. In this environment of change, the future challenges of our group will be to continue to optimally apply advanced technologies and methodologies of computer science to enhance the research returns of Space Department missions.

REFERENCES

1. Bostrom, C. O., and Ludwig, G. H., “Instrumentation for Space Physics,” Physics Today 19(7), 43–56 (1966).
2. Suther, L. L., “Imaging the Solar System with Computers,” Johns Hopkins APL Tech. Dig. 10(3), 238–245 (1989).
3. Preserving Valuable Space Data, United States General Accounting Office Report GAO/IMTEC-90-0, Washington, DC (1990).
4. Holland, B. B., Utterback, H. K., and Nylund, S. R., “The AMPTE Charged Composition Explorer Science Data System,” IEEE Trans. Geosci. Remote Sens. GE-23(3), 212–215 (1985).
5. Schmid, C. F., Handbook of Graphic Presentation, The Ronald Press Company, New York (1954).
6. Eick, S. G., “Information Visualization,” presented at The New Cyber Landscape Seminar, JHU/APL, Laurel, MD (13 Nov 1998), available at http://www.jhuapl.edu/cybertech/ct_5/eich_frm.htm (accessed 21 Jul 1999).
7. Tufte, E. R., The Visual Display of Quantitative Information, Graphics Press, Cheshire, CT (1983).


ACKNOWLEDGMENTS: We would like to acknowledge the late Dr. Thomas Potemra for his support of space science data processing. As with several Space Department groups, the Software and Data Systems Group grew from Dr. Potemra’s group. We benefited greatly from his vision, extraordinary enthusiasm for space physics, and mentoring. We thank Jacob Gunther for supplying information and acknowledge Tech Choo, Martha Kuster, and Joseph Skura for images used in this article, which resulted from their software development.

THE AUTHORS

STUART R. NYLUND received his B.S. degree in physics from Iowa State University in 1974 and his M.S. degree in computer science from The Johns Hopkins University in 1979. He is a Senior Professional Staff member of APL’s Software and Data Systems Group. He began his career at APL in 1974 by working in the Submarine Technology Department on developing processing and analysis software for radar and oceanographic studies. Since joining the Space Department in 1982, Mr. Nylund has been involved in the development of ground data processing and analysis systems for a number of spacecraft missions, including AMPTE/CCE, Delta 183, Galileo, Geotail, and ACE. His e-mail address is [email protected].

DOUGLAS B. HOLLAND received his B.S. degree in computer science from the University of Maryland in 1985. He is a Senior Professional Staff member of APL’s Software and Data Systems Group and Supervisor of the Ground Systems Software Section. Mr. Holland is involved in several areas of NEAR mission software development. His e-mail address is [email protected].
