
WRAP Regional Modeling Center, Attribution of Haze Meeting, Denver 7/22/04 - WRAP 2002 Visibility Modeling: Emission, Meteorology Inputs and CMAQ Performance Evaluation


TRANSCRIPT

WRAP 2002 Visibility Modeling: Emission, Meteorology Inputs and CMAQ Performance Evaluation
Gail Tonnesen, Bo Wang, Chao-Jung Chien, Zion Wang, Mohammad Omary (University of California, Riverside); Zac Adelman, Andy Holland (University of North Carolina); Ralph Morris et al. (ENVIRON International Corporation, Novato, CA)
WRAP Attribution of Haze Meeting, Denver, CO, July 22, 2004

Summary of RMC 2002 Modeling
- Annual MM5 simulations run at the RMC
- Emissions processed with SMOKE
- Preliminary 2002 Scenario C used here
- CMAQ version 4.3 (released October 2003)
- Data summaries, QA, and results are posted on the RMC web page

MM5 Modeling Domain (36 & 12 km)
- National RPO grid: Lambert conformal conic projection; center: -97°, 40°; true latitudes: 33°, 45°
- MM5 domain: 36 km: (165, 129, 34); 12 km: (220, 199, 34)
- 24-category USGS land-use data: 36 km: 10 min. (~19 km); 12 km: 5 min. (~9 km)

MM5 Physics
  Physics Option      | Configuration                 | Configure.user
  --------------------+-------------------------------+----------------
  Microphysics        | Reisner 2 (with graupel)      | IMPHYS = 7
  Cumulus scheme      | Kain-Fritsch                  | ICUPA = 6
  PBL                 | Pleim-Chang (ACM)             | IBLTYP = 7
  Radiation           | RRTM                          | FRAD = 4
  Land-surface model  | Pleim-Xiu                     | ISOIL = 3
  Shallow convection  | No                            | ISHALLO = 0
  Snow cover effect   | Simple snow model             | ISNOW = 2
  Thermal roughness   | Garratt                       | IZ0TOPT = 1
  Varying SST         | Yes                           | ISSTVAR = 1
  Time step           | 90 seconds (PX uses an internal 40-second time step)

Subdomains for 36/12-km Model Evaluation
1 = Pacific NW, 2 = SW, 3 = North, 4 = Desert SW, 5 = CenrapN, 6 = CenrapS, 7 = Great Lakes, 8 = Ohio Valley, 9 = SE, 10 = NE, 11 = MidAtlantic

Evaluation Review
- Evaluation methodology: synoptic evaluation; statistical evaluation using METSTAT and surface data (WS, WD, T, RH); evaluation against upper-air observations
- Statistics: absolute bias and error, RMSE, IOA (index of agreement) (see the computational sketch after this group of MM5 slides)
- Evaluation datasets: NCAR dataset ds472 airport surface met observations; twice-daily upper-air profile observations (~120 in the US): temperature, moisture

METSTAT Evaluation Package
- Statistics: absolute bias and error, RMSE, IOA
- Daily and, where appropriate, hourly evaluation
- Statistical performance benchmarks based on an analysis of > 30 MM5 and RAMS runs
- Not meant as a pass/fail test, but to put modeling results into perspective

Evaluation of 36-km WRAP MM5 Results
- Model performed reasonably well for eastern subdomains, but not for the west (WRAP region)
- General cool, moist bias in the western US
  - Difficulty resolving western US orography? May get better performance with higher resolution
  - Pleim-Xiu scheme optimized more for the eastern US? More optimization needed for desert and rocky ground?
- MM5 performs better in winter than in summer (weaker forcing in summer)
- July 2002: Desert SW subdomain exhibits a low temperature and high humidity bias
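The surface statistics named on the Evaluation Review and METSTAT slides (absolute bias, gross error, RMSE, and the index of agreement) are standard paired model-observation metrics. The sketch below is a minimal illustration of how they can be computed; it is not the METSTAT code itself, and the function and variable names are placeholders.

```python
import numpy as np

def paired_stats(model, obs):
    """Bias, gross error, RMSE, and index of agreement (IOA) for paired
    model/observation values (e.g., hourly temperature at ds472 sites).
    Inputs are 1-D sequences of equal length."""
    m = np.asarray(model, dtype=float)
    o = np.asarray(obs, dtype=float)
    diff = m - o
    bias = diff.mean()                       # mean bias
    error = np.abs(diff).mean()              # mean absolute (gross) error
    rmse = np.sqrt((diff ** 2).mean())       # root-mean-square error
    # Willmott's index of agreement: 1 = perfect agreement, 0 = no skill
    obar = o.mean()
    ioa = 1.0 - (diff ** 2).sum() / ((np.abs(m - obar) + np.abs(o - obar)) ** 2).sum()
    return {"bias": bias, "error": error, "rmse": rmse, "ioa": ioa}

# Example with invented hourly temperatures (deg C):
print(paired_stats([10.2, 12.1, 15.0, 13.4], [11.0, 12.5, 14.2, 13.0]))
```

For wind direction, the model-minus-observation difference is normally wrapped into the ±180° range before averaging; the sketch above applies as written only to scalar quantities such as temperature and humidity.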
2002 MM5 Model Evaluation: 12 vs. 36 km Results
Chris Emery, Yiqin Jia, Sue Kemball-Cook, and Ralph Morris (ENVIRON International Corporation) & Zion Wang (UCR CE-CERT); Western Regional Air Partnership (WRAP) National RPO Meeting, May 25, 2004

WRAP 36 km / 12 km July Wind Performance Comparison
[Charts: wind speed RMSE (m/s) and wind direction error (degrees) for the Desert SW, North, SW, and Pacific NW subdomains, comparing the 12-km and 36-km runs against the MM5/RAMS-run benchmark.]

MM5 Implications for AoH
- The RMC is continuing to test alternative MM5 configurations, to be completed at the end of ...
- Expect some reduction in bias and error in the WRAP states; however, even in the best case there will be error and bias in MM5 that must be considered when using CMAQ for source attribution

Emissions Inventory Summary
- Preliminary 2002 Scenario C, based on the 1996 NEI grown to 2002, with many updates by WRAP contractors and other RPOs
- Processed for CMAQ using SMOKE
- Extensive QA plots on the web page: both SMOKE QA and post-SMOKE QA

Emissions Sources by Category & RPO
[Chart: emissions by source category and RPO.]

WRAP 2002 Annual NOx Emissions
[Chart: annual NOx emissions by source category: area, biogenic, on-road, non-road, road dust, point, Rx fire, ag fire, wildfire, offshore.]

WRAP NOx Emissions by Source & State
[Chart: NOx emissions (tons/yr) by source category (ag fire, Rx fire, wildfire, area, point, nonroad, onroad) for the WRAP states: Arizona, California, Colorado, Idaho, Montana, Nevada, New Mexico, North Dakota, Oregon, South Dakota, Utah, Washington, Wyoming.]

WRAP 2002 Annual SO2 Emissions
[Chart: annual SO2 emissions by source category: area, biogenic, on-road, non-road, road dust, point, Rx fire, ag fire, wildfire, offshore.]

WRAP SO2 Emissions by Source & State
[Chart: SO2 emissions (tons/yr) by source category for each WRAP state.]

WRAP NH3 Emissions by Source Category
[Chart: NH3 emissions (tons/yr) by source category for each WRAP state.]

Emissions Summary
- Preliminary 2002 EI used here. Updates for the final 2002 EI will include:
  - New EI data from other RPOs and Canada
  - 2002 NEI to replace the grown 1996 NEI
  - Reprocessing in SMOKE with the final MM5
- All final inputs are ready now except Canada & MM5

CMAQ Simulations
- CMAQ v4.3; 36-km grid, 112 x 148 x 19; annual run
- CB4 chemistry
- Evaluated using IMPROVE, CASTNet, NADP, STN, AIRS/AQS

PM Performance Criteria
- Guidance from EPA is not yet ready: difficult to assert that the model is adequate
- Therefore, we use a variety of ad hoc performance goals and benchmarks to display CMAQ results
- We completed a variety of analyses: over 20 performance metrics; scatter plots and time-series plots; soccer plots; bugle plots

Goal of Model Evaluation
- We completed a variety of analyses: over 20 performance metrics; scatter plots and time-series plots; soccer plots; bugle plots
- Goal is to decide whether we have enough confidence to use the model for AoH: is this a valid application of the model?

Soccer Goal Plots
- Plot error as a function of bias (a minimal plotting sketch follows this slide)
- Ad hoc performance goal: 15% bias, 35% error, based on O3 modeling goals
- Larger error and bias are observed among different PM data methods and monitoring networks
- Performance benchmark: 30% bias, 70% error (2x the performance goals); PM models can achieve this level in many cases
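A soccer goal plot draws nested "goal" and "benchmark" boxes in bias-error space and scatters each month's (or network's) bias/error statistics over them; points falling inside a box meet that level of performance. The sketch below is a minimal matplotlib illustration using the 15%/35% goal and 30%/70% benchmark quoted on the slide; the plotted data points are invented.

```python
import matplotlib.pyplot as plt

# (fractional bias %, fractional error %) pairs, e.g. one per month -- invented values
points = [(-12, 28), (5, 33), (22, 48), (-35, 64), (10, 40)]

fig, ax = plt.subplots()
# Nested performance boxes: goal (15% bias, 35% error) and benchmark (30%, 70%)
for bias_lim, err_lim, label in [(15, 35, "goal"), (30, 70, "benchmark")]:
    ax.add_patch(plt.Rectangle((-bias_lim, 0), 2 * bias_lim, err_lim,
                               fill=False, linestyle="--", label=label))
bias, err = zip(*points)
ax.scatter(bias, err)
ax.set_xlabel("Fractional bias (%)")
ax.set_ylabel("Fractional error (%)")
ax.set_xlim(-100, 100)
ax.set_ylim(0, 100)
ax.legend()
plt.show()
```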
Annual CMAQ vs IMPROVE
[Plots: CMAQ versus IMPROVE, annual and by season (spring, summer, fall, winter).]

Annual CMAQ vs CASTNet
[Plots: CMAQ versus CASTNet, annual and by season (spring, summer, fall, winter).]

Annual CMAQ vs STN
[Plots: CMAQ versus STN, annual and by season (spring, summer, fall, winter).]

Annual CMAQ vs NADP
[Plots: CMAQ versus NADP, annual and by season (spring, summer, fall, winter).]

Performance Goals and Criteria - Proposed by Jim Boylan
- Based on FE and FB calculations (a sketch of the FB/FE calculation follows the monthly plots below)
- Vary as a function of species concentration
- Goals: FE ≤ +50% and FB within ±30%
- Criteria: FE ≤ +75% and FB within ±60%
- Less abundant species should have less stringent performance goals and criteria
[Charts: proposed PM performance goals and proposed PM performance criteria.]

Monthly Fractional Bias and Error by Species
[Plots: monthly fractional bias and fractional error for SO4, NO3, NH4, OC, EC, and PM2.5.]
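Fractional bias and fractional error normalize each paired model-observation difference by the average of the two values, which bounds FB to ±200% and FE to 0-200%. The sketch below is a minimal, assumed implementation of the two metrics with a check against the flat goal and criteria numbers quoted on the slide; it does not include the concentration-dependent relaxation for less abundant species.

```python
import numpy as np

def fractional_bias_error(model, obs):
    """Fractional bias (FB) and fractional error (FE), in percent, for
    paired model/observed concentrations (e.g., monthly-mean SO4)."""
    m = np.asarray(model, dtype=float)
    o = np.asarray(obs, dtype=float)
    denom = m + o
    fb = 200.0 * np.mean((m - o) / denom)        # bounded to +/-200%
    fe = 200.0 * np.mean(np.abs(m - o) / denom)  # bounded to 0-200%
    return fb, fe

def meets(fb, fe, fb_lim, fe_lim):
    return abs(fb) <= fb_lim and fe <= fe_lim

# Invented example concentrations (ug/m3)
fb, fe = fractional_bias_error([1.2, 0.8, 2.5, 0.4], [1.0, 1.1, 2.0, 0.6])
print(f"FB = {fb:+.1f}%, FE = {fe:.1f}%")
print("meets goal    :", meets(fb, fe, 30, 50))   # |FB| <= 30%, FE <= 50%
print("meets criteria:", meets(fb, fe, 60, 75))   # |FB| <= 60%, FE <= 75%
```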
CMAQ & EI Versions
- TSSA results are run in CMAQ v4.4 with emissions version Preliminary 2002 C
- Performance evaluation used CMAQ v4.3
- Previous CMAQ runs used CMAQ v4.3 with Preliminary 2002 B emissions (no fires)

CMAQ v4.3 & v4.4 versus IMPROVE, July
[Plots: CMAQ v4.3 versus v4.4 performance against IMPROVE for July.]

CMAQ Ozone Performance
- CMAQ v4.3 mean fractional bias (no filter): January +25% MFB; July 20% MFB
- Slightly worse January O3 performance in v4.4

CMAQ Emissions B & C versus IMPROVE, Summer
[Plots: CMAQ with Preliminary 2002 B versus C emissions against IMPROVE for summer.]

Issues for AoH
- Is this set of emissions / MM5 / CMAQ adequate for studying AoH?
- Analysis of CMAQ performance on the best and worst days is still in progress; however, we expect CMAQ will tend to over-predict the lows and under-predict the highs
- Should we use CMAQ results unpaired in time? (See the sketch at the end of this transcript.)

Options for Future Work
- Continue CMAQ source apportionment with current data sets
- Wait for new MM5 and emissions
- Investigate other CMAQ configurations: unlikely to see large improvements
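One way to read "unpaired in time" is to compare the distributions of modeled and observed values at a site rather than matching individual days, for example averaging the lowest and highest 20% of each ranked series. The sketch below illustrates that idea; it is an assumed, simplified illustration rather than the RMC's attribution-of-haze procedure, and the data are invented.

```python
import numpy as np

def unpaired_extremes(model, obs, frac=0.2):
    """Compare ranked (unpaired-in-time) model and observed values:
    mean of the lowest and highest `frac` of each distribution."""
    m = np.sort(np.asarray(model, dtype=float))
    o = np.sort(np.asarray(obs, dtype=float))
    n = max(1, int(round(frac * len(o))))
    return {
        "obs_low": o[:n].mean(),   "model_low": m[:n].mean(),
        "obs_high": o[-n:].mean(), "model_high": m[-n:].mean(),
    }

# Invented daily values for one site; the model series is compressed toward
# the mean to mimic "over-predict lows, under-predict highs".
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 10.0, size=365)
model = 0.5 * obs + 0.5 * obs.mean() + rng.normal(0, 3, size=365)
print(unpaired_extremes(model, obs))
```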