
Page 1: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

October 27, 2003, CMAS Annual Meeting, RTP, NC

University of California, Riverside

Page 2: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Modeling Team Participants

• UC Riverside: Gail Tonnesen, Zion Wang, Chao-Jung Chien, Mohammad Omary, Bo Wang

• Ralph Morris et al., ENVIRON Corporation

• Zac Adelman et al., Carolina Environmental Program

• Tom Tesche et al., Alpine Geophysics

• Don Olerud, BAMS

Page 3: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Acknowledgments

• Western Regional Air Partnership: John Vimont, Mary Uhl, Kevin Briggs, Tom Moore

• VISTAS: Pat Brewer, Jim Boylan, Sheila Holman

Page 4: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Topics

• Model Performance Evaluation

• WRAP 1996 Model Performance Evaluation

• VISTAS 2002 Sensitivity Results

• CMAQ Benchmarks

Page 5: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

WRAP Modeling

• 1996 Annual Modeling

• 36 km grid for the western US: 95 × 85 grid cells, 18 vertical layers

• MM5 by Olerud et al.

Page 6: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

WRAP Emissions Updates

• Corrections to point sources

• MOBILE6 beta for WRAP states

• Monthly corrections for NH3 based on EPA/ORD inverse modeling

• Updated non-road model

• Typical fires used for the results shown here

• 1996 NEI for non-WRAP states

Page 7: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

WRAP - CMAQ revisions

• v0301, released in March 2001
– used for the base case and all sensitivity cases in WRAP's 309 simulations

• v0602, released in June 2002

• v4.2.2, released in March 2003

• v4.3, released in Sept. 2003

Page 8: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

[Chart] Mean Normalized Bias (%), yearly, by species (SO4, NO3, NH4, OC, EC, SOIL, CM, PM25, RCFM, PM10, Bext) for CMAQ v0301, v4.2.2, and v4.3. Comparisons based on the IMPROVE evaluation.

Page 9: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Model Performance Metrics

• How well does the model reproduce mean, modal, and variational characteristics?
– Using observations to normalize model error and bias can lead to misleading conclusions:
• if the observation is very small, the normalized bias or error becomes very large
• model under-prediction is bounded at -1 (-100%)
• model over-prediction is therefore weighted more heavily than under-prediction

• We used Mean Normalized Error & Bias in the 309 work:
– a poor metric for clean conditions
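As a small numerical illustration of this asymmetry (concentrations chosen only for illustration, not from the evaluation):

```latex
% Illustrative values only: a factor-of-10 over-prediction
% versus a factor-of-10 under-prediction
\frac{P-O}{O}\bigg|_{O=0.1,\,P=1.0} = +900\%
\qquad\text{vs.}\qquad
\frac{P-O}{O}\bigg|_{O=1.0,\,P=0.1} = -90\%
```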

Page 10: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Recommended Performance Metrics

• Use fractional error and bias:
– bias and error are bounded by symmetric limits of ±2

• Normalized Mean Error & Bias:
– divide the sum of the errors (or biases) by the sum of the observations

• Coefficient of determination (R²):
– the fraction of the variability in the model predictions explained by their relationship to the ambient observations, i.e., how closely the predictions track the observations

(A minimal computational sketch of these metrics follows.)
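A minimal sketch of the recommended metrics, written in Python for illustration; the function and variable names are ours, not from the UCR evaluation software:

```python
import numpy as np

def recommended_metrics(pred, obs):
    """Fractional bias/error, normalized mean bias/error, and R^2
    for paired model predictions and observations (same length)."""
    p = np.asarray(pred, dtype=float)
    o = np.asarray(obs, dtype=float)
    n = len(o)

    # Mean fractional bias and error: bounded by symmetric limits of +/-2
    mfb = (2.0 / n) * np.sum((p - o) / (p + o))
    mfe = (2.0 / n) * np.sum(np.abs(p - o) / (p + o))

    # Normalized mean bias and error: sums divided by the sum of observations
    nmb = np.sum(p - o) / np.sum(o)
    nme = np.sum(np.abs(p - o)) / np.sum(o)

    # Coefficient of determination: squared Pearson correlation of P and O
    r2 = np.corrcoef(p, o)[0, 1] ** 2

    # Multiply by 100 where the slides report a metric as a percentage
    return {"MFB": mfb, "MFE": mfe, "NMB": nmb, "NME": nme, "R2": r2}

# Example with made-up 24-hour SO4 concentrations (ug/m3):
print(recommended_metrics([2.0, 1.5, 0.8], [1.8, 1.2, 1.0]))
```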

Page 11: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Statistical measures used in model performance evaluation

• Accuracy of unpaired peak (Au): $A_u = \frac{P_u^{\mathrm{peak}} - O_{\mathrm{peak}}}{O_{\mathrm{peak}}}$
(O_peak = peak observation; P_u^peak = unpaired peak prediction within 2 grid cells of the peak observation site)

• Accuracy of paired peak (Ap): $A_p = \frac{P_{\mathrm{peak}} - O_{\mathrm{peak}}}{O_{\mathrm{peak}}}$
(P_peak = peak prediction paired in time and space with the peak observation)

• Coefficient of determination: $R^2 = \frac{\left[\sum_{i=1}^{N}(P_i - \bar{P})(O_i - \bar{O})\right]^2}{\sum_{i=1}^{N}(P_i - \bar{P})^2 \, \sum_{i=1}^{N}(O_i - \bar{O})^2}$

• Normalized Mean Error (NME), reported as %: $\mathrm{NME} = \frac{\sum_{i=1}^{N} |P_i - O_i|}{\sum_{i=1}^{N} O_i}$

• Root Mean Square Error (RMSE): $\mathrm{RMSE} = \left[\frac{1}{N}\sum_{i=1}^{N}(P_i - O_i)^2\right]^{1/2}$

• Fractional Gross Error (FE): $\mathrm{FE} = \frac{2}{N}\sum_{i=1}^{N}\frac{|P_i - O_i|}{P_i + O_i}$

Notation: $P_i$ = prediction at time and location i; $O_i$ = observation at time and location i; $\bar{P}$, $\bar{O}$ = arithmetic averages of $P_i$ and $O_i$, i = 1, 2, …, N.

Page 12: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Statistical measures used in model performance evaluation (continued)

• Mean Absolute Gross Error (MAGE): $\mathrm{MAGE} = \frac{1}{N}\sum_{i=1}^{N} |P_i - O_i|$

• Mean Normalized Gross Error (MNGE; Mean Normalized Error, MNE), reported as %: $\mathrm{MNGE} = \frac{1}{N}\sum_{i=1}^{N}\frac{|P_i - O_i|}{O_i}$

• Mean Bias (MB): $\mathrm{MB} = \frac{1}{N}\sum_{i=1}^{N}(P_i - O_i)$

• Mean Normalized Bias (MNB), reported as %: $\mathrm{MNB} = \frac{1}{N}\sum_{i=1}^{N}\frac{P_i - O_i}{O_i}$

• Mean Fractionalized Bias (Fractional Bias, MFB), reported as %: $\mathrm{MFB} = \frac{2}{N}\sum_{i=1}^{N}\frac{P_i - O_i}{P_i + O_i}$

• Normalized Mean Bias (NMB), reported as %: $\mathrm{NMB} = \frac{\sum_{i=1}^{N}(P_i - O_i)}{\sum_{i=1}^{N} O_i}$
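The mean-based measures above map directly onto paired prediction/observation arrays. A minimal sketch in Python, for illustration only (this is not the UCR evaluation code; the normalized measures are multiplied by 100 when reported as %):

```python
import numpy as np

def mean_based_metrics(pred, obs):
    """MAGE, MNGE, MB, MNB, and RMSE for paired predictions/observations,
    following the definitions listed above."""
    p = np.asarray(pred, dtype=float)
    o = np.asarray(obs, dtype=float)

    mage = np.mean(np.abs(p - o))          # Mean Absolute Gross Error
    mnge = np.mean(np.abs(p - o) / o)      # Mean Normalized Gross Error
    mb   = np.mean(p - o)                  # Mean Bias
    mnb  = np.mean((p - o) / o)            # Mean Normalized Bias
    rmse = np.sqrt(np.mean((p - o) ** 2))  # Root Mean Square Error

    return {"MAGE": mage, "MNGE": mnge, "MB": mb, "MNB": mnb, "RMSE": rmse}
```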

Page 13: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Statistical measures used in model performance evaluation

• In addition…
– Mean observation
– Mean prediction
– Standard deviation (SD) of observation
– Standard deviation (SD) of prediction
– Correlation variance

Page 14: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Expanded Model Evaluation Software to include…

• Ambient data evaluation for air quality monitoring networks:
– IMPROVE (24-hour average PM)
– CASTNet (weekly average PM & gas)
– STN (24-hour average PM)
– AQS (hourly gas)
– NADP (weekly total deposition)
– SEARCH

• 17 statistical measures in the model performance evaluation

• All performance metrics can be analyzed in an automated process for the model and data selected by (see the sketch after this list):
– allsite_daily, allsite_monthly, allsite_yearly
– onesite_daily, onesite_monthly, onesite_yearly
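A sketch of how the site/time selections above might be driven; the data layout, column names, and function are hypothetical, since the slides do not show the software internals:

```python
import pandas as pd

def evaluate(pairs: pd.DataFrame, scope: str, period: str, metric):
    """Apply one statistical measure over paired model/obs data.

    pairs  : DataFrame with columns ['site', 'date', 'obs', 'model'],
             where 'date' is a datetime column (hypothetical layout).
    scope  : 'allsite' (pool all sites) or 'onesite' (group by site).
    period : 'daily', 'monthly', or 'yearly'.
    metric : function of (model_series, obs_series) -> float.
    """
    freq = {"daily": "D", "monthly": "M", "yearly": "Y"}[period]
    keys = [pd.Grouper(key="date", freq=freq)]
    if scope == "onesite":
        keys = ["site"] + keys
    return pairs.groupby(keys).apply(lambda g: metric(g["model"], g["obs"]))

# e.g. the 'onesite_monthly' selection with Normalized Mean Bias:
# evaluate(pairs, "onesite", "monthly", lambda p, o: (p - o).sum() / o.sum())
```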

Page 15: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Community Model Evaluation Tool?

• Facilitate model evaluation.

• Benefit from shared development of the tool.

• Share monitoring data.

• UCR software available at: www.cert.ucr.edu/aqm

Page 16: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

[Chart] SO4, monthly statistical measures (%) by month (1–12): Normalized Mean Error, Mean Normalized Gross Error, Mean Normalized Bias, Mean Fractionalized Bias, and Normalized Mean Bias. WRAP 1996 evaluation, CMAQ v4.3.

Page 17: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

[Chart] NO3, monthly statistical measures (%) by month (1–12): Normalized Mean Error, Mean Normalized Gross Error, Mean Normalized Bias, Mean Fractionalized Bias, and Normalized Mean Bias. WRAP 1996 evaluation, CMAQ v4.3.

Page 18: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

[Chart] OC, monthly statistical measures (%) by month (1–12): Normalized Mean Error, Mean Normalized Gross Error, Mean Normalized Bias, Mean Fractionalized Bias, and Normalized Mean Bias. WRAP 1996 evaluation, CMAQ v4.3.

Page 19: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

[Chart] EC, monthly statistical measures (%) by month (1–12): Normalized Mean Error, Mean Normalized Gross Error, Mean Normalized Bias, Mean Fractionalized Bias, and Normalized Mean Bias. WRAP 1996 evaluation, CMAQ v4.3.

Page 20: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

WRAP 1996 cases in progress

• New fugitive dust emissions model

• New NH3 emissions model

• Actual Prescribed & Ag burning emissions

• 2002 annual simulations being developed.

Page 21: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

VISTAS Model 12 km Domain

• 34-layer MM5 by Olerud

• 1999 NEI

• CMAQ v3

Page 22: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

VISTAS Sensitivity Cases

• 3 episodes: Jan 2002, July 1999, July 2001

• Sensitivity cases:
– MM5 MRF and ETA-MY PBL schemes
– PBL height, Kz_min, layer collapsing
– CB4-2002
– SAPRC99
– CMAQ-AIM
– GEOS-CHEM for BC
– NH3 emissions

Page 23: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

VISTAS Key Findings

• NO3 over-predictions in winter, under-predictions in summer
– Thornton et al. N2O5 chemistry had a small benefit: July MNB increased from –50% to –45%

• SO4 performance reasonably good

• Problems with PBL height
– Kz_min = 1 improved performance
– Investigating PBL height corrections

• Minor differences between 19 and 34 layers

Page 24: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Benchmarks

• Athlon MP 2000 (1.66 GHz)

• Opteron 246 (2.0 GHz)
– 32-bit code
– 64-bit code

• Compare 1, 4, and 8 CPUs.

• Ported CMAQ to 64-bit SuSE
– pointers & memory allocation updated for 64 bit

Page 25: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Test Case for benchmarks

• VISTAS 12 km domain
– 168 × 177 grid cells, 19 layers

• Benchmarks for CMAQ 4.3

• One-day simulation, CB4, MEBI

• Single-CPU run time (hours:minutes):
– Athlon 2 GHz: 14:10
– Opteron 32-bit, 2 GHz: 12:49
– Opteron 64-bit, 2 GHz: 10:57
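Converting these single-CPU times to minutes (14:10 = 850, 12:49 = 769, 10:57 = 657) gives the relative speedups:

```latex
\frac{850}{769}\approx 1.11\times\ \text{(Opteron 32-bit vs. Athlon)},\quad
\frac{850}{657}\approx 1.29\times\ \text{(Opteron 64-bit vs. Athlon)},\quad
\frac{769}{657}\approx 1.17\times\ \text{(64-bit vs. 32-bit Opteron)}
```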

Page 26: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

[Chart] Clock Time by CPU Type: wall-clock time (minutes) vs. number of CPUs (1–8) for Athlon 1.66 GHz, Opteron-32, Opteron-64, and Athlon 2.13 GHz.

Page 27: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

[Chart] Parallel Scaling by CPU Type: scaling ratio vs. number of CPUs (2–8) for Athlon, Opteron-32, and Opteron-64, compared with perfect scaling.

Page 28: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

[Chart] Parallel Scaling by CPU Type: actual/perfect scaling ratio vs. number of CPUs (1–8) for Athlon MP, Opteron-32, and Opteron-64.
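The slides do not define the plotted ratios; a plausible reading, assuming T(N) is the wall-clock time on N CPUs, is:

```latex
% Assumed definitions (not stated on the slides):
R(N) = \frac{T(N)}{T(1)},\qquad
R_{\text{perfect}}(N) = \frac{1}{N},\qquad
\frac{\text{actual}}{\text{perfect}} = \frac{R(N)}{1/N} = \frac{N\,T(N)}{T(1)}
```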

Page 29: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Optimal Cost Configuration

• Small cluster < 8 CPUs use Athlon

• Large cluster >16 CPUs use Opterons?

Page 30: Regional Haze Modeling: Recent Modeling Results for VISTAS and WRAP

Conclusions

• Major Improvements in WRAP 1996 Model

• WRAP 2002 annual modeling underway

• VISTAS sensitivity studies:
– still have problems with NO3
– need a better NH3 inventory
– need more attention to PBL heights in MM5

• Community model evaluation tool?