Some thoughts on the validation of fire products

Ivan Csiszar, UMd

Topics
• Active fire products
  – Binary yes/no detection
  – Sub-pixel size and temperature
  – Fire radiative power and energy
• Burned area products
  – Binary yes/no detection
  – Burn severity
• Validation approaches
  – direct validation
  – indirect validation (input as source – check derived parameter)
• Spatial accuracy

What is a “fire”?
• Producers (remote sensing community)
  – the smallest mapping/sensing unit with detectable integrated “amount” of fire
    • location: flagging pixels for active fire or burned area
      – small active fires may not be flagged as burned area
    • summary sub-pixel characteristics: size/temperature, FRP, burn severity
• User community
  – fire of interest
    • larger than the “smallest” actionable fire
    • larger than the “smallest” non-negligible fire event, either individually or aggregated in space and/or time
  – … and its characteristics

The validation process

• 1. Determine “absolute” detection limits
  – depend on a wide range of circumstances
• 2. Relate “absolute” detection limits to regional fire characteristics and user requirements

Validation strategy
• Needs to be driven by user requirements
• Metrics need to be meaningful for users
• Pixel-scale yes/no fire detection: not a continuous variable per se
  – rather: probabilities of detection, detection and false alarm rates + uncertainty (see the sketch after this list)
• Cumulative statistics over larger areas
  – e.g. number of fire pixels vs. number of fires
  – total burned areas
• Fire characteristics: continuous variables
  – bias + error bar
• Sampling strategy
  – core sites – rather broadly defined target areas
    • stratify by fire regime
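To make the detection and false alarm metrics concrete, here is a minimal sketch (Python; the function names and the Wilson-interval choice are illustrative assumptions, not part of the presentation) of how pixel-scale yes/no detections could be compared against a coincident reference fire mask and reported with an uncertainty:

    import numpy as np

    def wilson_interval(k, n, z=1.96):
        """Approximate 95% Wilson confidence interval for a proportion k/n."""
        if n == 0:
            return (0.0, 1.0)
        p = k / n
        denom = 1.0 + z**2 / n
        center = (p + z**2 / (2 * n)) / denom
        half = z * np.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
        return (max(0.0, center - half), min(1.0, center + half))

    def yes_no_accuracy(detected, reference):
        """detected, reference: boolean fire masks on a common pixel grid."""
        detected = np.asarray(detected, bool)
        reference = np.asarray(reference, bool)
        hits = np.sum(detected & reference)           # correctly detected fire pixels
        misses = np.sum(~detected & reference)        # omission errors
        false_alarms = np.sum(detected & ~reference)  # commission errors
        pod = hits / max(hits + misses, 1)            # probability of detection
        far = false_alarms / max(hits + false_alarms, 1)  # false alarm ratio
        return pod, wilson_interval(hits, hits + misses), far

With suitable reference data, such rates would then be reported per fire-regime stratum or target area rather than as a single global number, in line with the sampling strategy above.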

Active fire “yes/no” product validation

• Producers’ accuracy measures
  – detection capabilities and false alarm rates
    • theoretical: radiative transfer modeling
      – wide range of circumstances
      – how accurate and realistic are they?
        » in-situ or high resolution remote sensing reference data
    • empirical:
      – word of mouth
      – visual inspection – sanity check
      – using in-situ or high resolution remote sensing data as reference
        » logistical difficulties in collecting coincident reference data
        » difficulties in selecting proper methodology (fire parameters, metrics, statistical model etc.)
        » limited set of circumstances
    • fusion of theoretical & empirical – anchor points to support simulation-based assessment

Theoretical detection envelopes from radiative transfer simulation

L. Giglio, MODIS active fire product
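The slide only shows the resulting envelopes; the sketch below (Python; the fire temperature, background temperature and detection threshold are illustrative assumptions, not the actual MODIS algorithm) illustrates the underlying radiative-transfer idea: mix Planck radiances for a hot sub-pixel fire and its background at 4 µm, and find the smallest fire fraction that raises the pixel brightness temperature above an assumed threshold.

    import numpy as np

    H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann constants

    def planck(wavelength_m, temp_k):
        """Blackbody spectral radiance (W m-2 sr-1 m-1)."""
        a = 2 * H * C**2 / wavelength_m**5
        return a / (np.exp(H * C / (wavelength_m * K * temp_k)) - 1.0)

    def brightness_temp(wavelength_m, radiance):
        """Invert the Planck function to a brightness temperature (K)."""
        a = 2 * H * C**2 / wavelength_m**5
        return H * C / (wavelength_m * K * np.log(a / radiance + 1.0))

    def min_detectable_fraction(t_fire=1000.0, t_bg=300.0, wl=3.96e-6, dt_thresh=10.0):
        """Smallest sub-pixel fire fraction whose mixed-pixel 4 um brightness
        temperature exceeds the background by an assumed dt_thresh (K)."""
        for p in np.logspace(-7, 0, 500):  # fraction of the pixel covered by fire
            radiance = p * planck(wl, t_fire) + (1 - p) * planck(wl, t_bg)
            if brightness_temp(wl, radiance) - t_bg > dt_thresh:
                return p
        return None

Repeating this over a grid of fire temperatures and background conditions traces out a theoretical detection envelope of the kind shown for the MODIS product.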

Empirical detection envelopes from comparison with high-resolution satellite data

[Figure: estimated probability of MODIS detection vs. ASTER fire counts (Model 1, MODIS version 4); the solid line shows the random effect model. J. Morisette; ASTER + MODIS grid.]
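A minimal sketch of how such an empirical detection curve could be fitted (Python; a plain logistic regression is used here as a simplification of the random-effects model referenced in the figure, and the data arrays are placeholders, not the actual ASTER/MODIS sample):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Placeholder data: number of ASTER fire pixels within each coincident
    # MODIS pixel, and whether the MODIS algorithm flagged that pixel.
    aster_counts = np.array([0, 2, 5, 10, 20, 40, 60, 90, 120, 150]).reshape(-1, 1)
    modis_flag = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

    model = LogisticRegression()
    model.fit(aster_counts, modis_flag)

    # Estimated probability of MODIS detection for a pixel containing 50 ASTER fire pixels
    p50 = model.predict_proba([[50]])[0, 1]

The fitted curve, evaluated over the range of ASTER fire counts, is the empirical counterpart of the theoretical detection envelope.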

Ground truth
• Ideally, ground truth information
  – is coincident with the satellite observation
  – includes information on all circumstances that affect detection
    • spatially explicit temperature field within the satellite footprint (flaming, smoldering, pre- and post-burn)
    • atmospheric conditions (cloudiness etc.)
    • etc.
• Information needs are similar for the validation of “yes/no” detection and sub-pixel retrievals
• We rarely or never get this

Collection of “ground-truth” data
• In-situ (ground or aircraft)
  – large sample collected in collaboration with fire management agencies
    • coordinated effort
    • institutional obstacles
    • protocols needed – what and how is recorded
  – prescribed (controlled) burns
    • more detailed data, but very limited sample
      – useful for algorithm calibration, realistic mapping of conditions for simulations
      – hardly useful for statistical analysis

In-situ vs. satellite burned area maps

Siberia; red: in-situ, blue: AVHRR

Is our active fire detection within the time bracket and within a reasonable distance of the burned area? (A sketch of such a check follows below.)
• In-situ: location, area, start date, end date
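One way to perform this kind of indirect check is sketched below (Python; the record fields mirror the in-situ attributes listed above, but the field names, the distance tolerance and the date buffer are illustrative assumptions):

    from datetime import date, timedelta
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points in kilometres."""
        dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dlat / 2)**2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2)**2
        return 2 * 6371.0 * asin(sqrt(a))

    def matches_in_situ(detection, fire, max_dist_km=5.0, date_buffer_days=1):
        """Is a satellite active fire detection within the in-situ fire's time
        bracket (start/end dates plus a buffer) and within a reasonable distance
        of its reported location?"""
        start = fire["start_date"] - timedelta(days=date_buffer_days)
        end = fire["end_date"] + timedelta(days=date_buffer_days)
        in_time = start <= detection["date"] <= end
        dist = haversine_km(detection["lat"], detection["lon"], fire["lat"], fire["lon"])
        return in_time and dist <= max_dist_km

    # Illustrative records (area_ha carried along to mirror the in-situ attributes)
    fire = {"lat": 55.2, "lon": 98.4, "area_ha": 120.0,
            "start_date": date(2001, 7, 3), "end_date": date(2001, 7, 9)}
    detection = {"lat": 55.23, "lon": 98.37, "date": date(2001, 7, 5)}
    print(matches_in_situ(detection, fire))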

In-situ observations

[Figure: in-situ temperature record (deg C) over time (hours) at 2 m, 1 m and ground level.]

Collection of “ground-truth” data
• Moderate or high resolution satellite sensors
  – opportunistic (aircraft also to some extent)
    • may not be optimal sensing conditions (sensor gain setting etc.)
    • still may not be statistically representative
  – easier for geostationary than for polar
  – prescribed (controlled) burns
    • very limited sample
      – useful for algorithm calibration, realistic mapping of conditions for simulations
      – hardly useful for statistical analysis
• difficulties in scheduling coincident observations
  – less of an issue for geostationary!

Scaling up

[Figure: ASTER, airborne imagery, and Terra and BIRD orbits.]

Active fire “yes/no” validation

• Users’ accuracy measures
  – depend on application and the corresponding definition of “fires of interest”
  – relate to local/regional fire regime
    • detection rates (omission errors)
    • false alarm rates (commission errors)
  – sensor/gridcell resolution!
  – need to have statistics of local/regional fire regimes
  – need to relate to a parameter that is retrievable from satellites (a user-oriented sketch follows after this list)
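As an illustration of how a user-oriented accuracy figure differs from a producer-oriented one, the sketch below (Python; the size threshold and record structure are hypothetical) computes a detection rate only over reference fires at or above the user's "fire of interest" size:

    def users_detection_rate(reference_fires, min_size_ha):
        """Fraction of reference fires at or above the user's size threshold
        that were matched by at least one satellite fire pixel.
        reference_fires: list of dicts with 'area_ha' and 'detected' (bool)."""
        of_interest = [f for f in reference_fires if f["area_ha"] >= min_size_ha]
        if not of_interest:
            return None
        detected = sum(1 for f in of_interest if f["detected"])
        return detected / len(of_interest)

    # Illustrative: the same product looks very different to different users
    fires = [{"area_ha": 0.5, "detected": False},
             {"area_ha": 3.0, "detected": False},
             {"area_ha": 25.0, "detected": True},
             {"area_ha": 400.0, "detected": True}]
    print(users_detection_rate(fires, min_size_ha=0.1))   # all fires
    print(users_detection_rate(fires, min_size_ha=10.0))  # only "fires of interest"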

Users’ accuracy statement

Southern Africa

[Figure: estimated probability of MODIS detection vs. ASTER fire counts for Southern Africa (Model 1, MODIS version 4); the solid line shows the random effect model. Used to derive detection rates and false alarm rates.]

Note: this can be another remote sensing product! The same process needs to be done at this scale.

This can also be fed into simulations

Validation hierarchy

Stage 1 Validation:  Product accuracy has been estimated using a small number of independent measurements obtained from selected locations and time periods and ground-truth/field program effort.

Stage 2 Validation: Product accuracy has been assessed over a widely distributed set of locations and time periods via several ground-truth and validation efforts.

Stage 3 Validation: Product accuracy has been assessed and the uncertainties in the product well established via independent measurements in a systematic and statistically robust way representing global conditions.

Validation hierarchy
• Beta Data Product:
  – early release product, minimally validated and may still contain significant errors
  – available to allow users to gain familiarity with data formats and parameters
  – product is not appropriate as the basis for quantitative scientific publications
• Provisional Data Product:
  – product quality may not be optimal
  – incremental product improvements are still occurring
  – general research community is encouraged to participate in the validation and QA of the product, but needs to be aware that product validation and QA is ongoing
  – users are urged to contact science team representatives prior to use of the data in publications
  – may be replaced in the archive when the validated product becomes available
• Validated Product:
  – formally validated product, although validation is still ongoing
  – uncertainties are well defined
  – ready for use in scientific publications, and by other agencies
  – there may be later improved versions
  – earlier validated versions will be deleted from the archive after a 6-month overlap period, but code for earlier versions will be maintained indefinitely

Validation reporting

• A distributed fire product needs to be accompanied by
  – a statement on its validation status
  – product maturity stage
  – quantifiable information on product accuracy, using accuracy measures that are useful for that specific user group
  – can be complex and overwhelming (an illustrative sketch of such a statement follows below)
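A hypothetical example of how such a statement could travel with the product (Python dictionary; the field names and values are purely illustrative and do not reproduce any existing metadata standard):

    validation_report = {
        "product": "active fire, yes/no detection",
        "maturity_stage": "Provisional",
        "validation_stage": 1,  # per the validation hierarchy above
        "accuracy": {
            "detection_rate": {"value": 0.8, "ci_95": [0.7, 0.9],
                               "fires_of_interest": ">= 100 ha"},
            "false_alarm_rate": {"value": 0.05, "ci_95": [0.02, 0.10]},
        },
        "reference_data": "high-resolution satellite imagery, selected sites",
        "caveat": "statistics stratified by fire regime; not globally representative",
    }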

Summary
• Validation is a two-step process
  – includes several activities
  – fusion of empirical and theoretical approaches
• Most satellite-based fire products
  – have been validated only up to Stage 1
  – product maturity status is “Provisional”
• Further work is needed to
  – strengthen institutional collaboration between the fire research and management communities to foster the exchange of in-situ reference data and improved satellite-based active fire products
  – develop validation procedures and protocols
    • role of GOFC/GOLD and CEOS LPV
    • distribution of validation datasets
  – develop sampling strategy
• Validation: tool for inter-satellite comparison
