Project ID: 19-0036
2019 Technical Systems Audit Report
Knox County Department of Air Quality Management Knoxville, Tennessee Project Date: February 4 – 7, 2019 Report Date: March 19, 2019
Project Leader: Adam Zachary Superfund and Air Section Field Services Branch Science & Ecosystem Support Division USEPA – Region 4 980 College Station Road Athens, Georgia 30605-2720
The activities depicted in this report are accredited under the US EPA Region 4 Science and Ecosystem Support Division ISO/IEC 17025 accreditation issued by the ANSI-ASQ National Accreditation Board. Refer to certificate and scope of accreditation AT-1644.
SESD ID: 19-0036 Final Report Page 3 of 81
Table of Contents
1.0 Executive Summary ............................................................................................................. 4
2.0 Introduction .......................................................................................................................... 5
3.0 Commendations ................................................................................................................... 7
4.0 Findings and Recommendations .......................................................................................... 8
4.1 Field Operations ..............................................................................................................10
4.2 Laboratory Operations.....................................................................................................13
4.3 Records Management ......................................................................................................13
4.4 Data Management ...........................................................................................................13
4.5 Quality Assurance ...........................................................................................................16
5.0 Conclusions ........................................................................................................................ 21
Appendix 1 .................................................................................................................................... 23
1.0 Executive Summary
U.S. Environmental Protection Agency Region 4 Science and Ecosystem Support Division (EPA)
personnel conducted a Technical Systems Audit (TSA) of the Knox County Department of Air
Quality Management (KCDAQM) ambient air monitoring organization in February 2019. The
purpose of the TSA was to evaluate the operation and performance of the KCDAQM air
monitoring program, pursuant to 40 CFR Part 58, Appendix A, § 2.5. Data from the 2016-2018
calendar years were reviewed during the TSA.
KCDAQM is commended for continuing to develop a strong ambient air monitoring program.
KCDAQM staff demonstrated technical proficiency regarding the instrumentation as well as their
roles and responsibilities. Traceability and certification documentation for the criteria pollutant
monitors were easily located and provided to the EPA. There have been recent strides taken to
continue to improve and enhance the air monitoring program such as bolstering the employee
training program (i.e., training videos), development of a Quality Assurance (QA) database to
streamline data certification and data review. The agency also joined the national contract with
Eastern Research Group (ERG) for Lead (Pb) analysis.
KCDAQM currently operates seven State or Local Air Monitoring Stations (SLAMS). During the
TSA, six of the seven SLAMS sites were evaluated for compliance to siting criteria pursuant to 40
CFR 58, Appendix E. Two out of the six active air monitoring stations were found to have
particulate sampler probes which did not meet established regulatory requirements for distance
and spacing.
Several of the Findings in this TSA report will require the invalidation or qualification of a limited
amount of ambient air concentration data reported to the AQS database. Data that do not meet
certain critical criteria for siting (i.e., Finding 4.1.1) or data validation (i.e., Findings 4.4.1 -
4.4.5) are considered unusable for regulatory decision-making purposes. Further, criteria pollutant
data were collected at SLAMS monitoring stations without a current, approved QAPP in
place for the 2016 – 2018 calendar years (i.e., Finding 4.5.1).
The primary issues documented in the report indicate a need for additional resources towards
ambient air monitoring instrumentation and quality assurance processes. In the PM2.5 network,
several air monitoring stations are operating with aging ambient air monitoring instruments (i.e.,
approximately 70% of the monitors being 14 years old). Manufacturers no longer support
these samplers, making continued operation in the network difficult because spare parts for
instrument repair and annual maintenance have become hard to obtain. Further, there are no
functioning spare PM2.5 samplers available for deployment when a sampler at any of the four air
monitoring stations breaks down. The aging samplers and the lack of backup equipment are a
potential vulnerability for data capture and data completeness.
The other primary concern is the need for more resources devoted to quality assurance. KCDAQM
operates an extensive ambient air monitoring network with only one position dedicated to quality
assurance, and the current level of data review is not sufficient for a network of this size. The
agency must augment its data verification and validation processes to fortify against vulnerabilities
(e.g., by reorganizing roles during the multi-tiered data review). Most Findings and Concerns in
this report were directly affected by the limited resources in these areas.
In general, KCDAQM staff operate an air monitoring program that is well-maintained and quality-
controlled. Data collected within KCDAQM’s air monitoring network is of sufficient quality for
regulatory decision-making purposes.
2.0 Introduction
On February 4 - 7, 2019, USEPA Region 4 personnel conducted a TSA of the KCDAQM ambient
air monitoring program. The audit team included Adam Zachary (lead auditor) and Keith Harris
from EPA Region 4 Science and Ecosystem Support Division (SESD).
The purpose of the audit was to assess KCDAQM’s compliance with established regulations
governing the collection, analysis, validation, and reporting of ambient air quality data. Pursuant
to 40 CFR Part 58, Appendix A, § 2.5, TSAs of each Primary Quality Assurance Organization
(PQAO) are required to be conducted every three years. Data reviewed as part of this TSA
included that generated during the 2016-2018 calendar years. Data was queried from USEPA’s
Air Quality System (AQS) database prior to the on-site audit. EPA’s Ambient Air Monitoring
Technical Systems Audit Form was completed by KCDAQM staff prior to the on-site audit and is
included as Appendix 1 of this report.
The audit included a review of data, recordkeeping, documentation, and support facilities housed
at the KCDAQM central office, located at 1403 Davanna Street, in Knoxville, Tennessee. Six of
the seven regulatory air monitoring stations operated by KCDAQM were visited during the audit
and the six stations are listed below.
Common Site Name AQS Identification
Air Lab 47-093-1013
Burnside 47-093-0027
Ameristeel 47-093-0023
Rule 47-093-1017
Bearden 47-093-0028
Spring Hill 47-093-1020
During the audit, the following KCDAQM personnel were interviewed.
• Lynne Liddington, Director
• Brian Rivera, Environmental Program Manager
• Amber Talgo, Air Monitoring Manager
• Rebecca Larocque, Environmental Specialist
• Barron White, Environmental Specialist
• David Colvin, Environmental Specialist
The following AQS reports were reviewed in preparation for this TSA.
• AMP 251: QA Raw Assessment Report (2016-2018)
• AMP 256: QA Data Quality Indicator Report (2016-2018)
• AMP 350: Raw Data Report (2016-2018)
• AMP 380: Site Description Report (2016-2018)
• AMP 390: Monitor Description Report (2016-2018)
• AMP 430: Data Completeness Report (2016-2018)
• AMP 480: Design Value Report (2018)
• AMP 501: Extract Raw Data (2016-2018)
• AMP 503: Extract Sample Blank Data (2016-2018)
• AMP 504: Extract QA Data (2016-2018)
• AMP 600: Certification Evaluation and Concurrence (2016-2018)
Additionally, the following KCDAQM documents were reviewed.
• Quality Assurance Project Plan for the Knox County Air Quality Management Ambient Air
Monitoring Program, Revision 0, November 2018.
• Quality Assurance Project Plan for the Knox County, TN Ambient Air Quality Monitoring
Program, October 2009.
• Quality Management Plan, Knox County Department of Air Quality Management, Revision
3, September 2018.
• Ambient Air Monitoring Plan, Knox County, TN, Department of Air Quality Management,
April 2018.
• Standard Operating Procedure for Internal Audit of Monitors and Technical Systems,
Knox County Health Department, Revision 2, January 2017.
• Data Handling and Validation Standard Operating Procedure, Knox County Health
Department, Revision 0, May 2018.
• TEOM (Tapered Element Oscillating Microbalance) FEM PM10, Thermo 1405a, Standard
Operating Procedures, Knox County Health Department, Revision 1, October 2016.
• Thermo Model 2025 Sequential Sampler, Standard Operating Procedures, Knox County
Health Department, Revision 0, March 2018.
• Ozone Monitoring with Teledyne 400E and 703E, Standard Operating Procedures, Knox
County Health Department, Revision 0, December 2016.
• Volumetric-Flow-Control (VFC) High Volume TSP/Pb Monitors, Standard Operating
Procedures, Knox County Health Department, Revision 0, April 2018.
• Quality Bulletin – Ozone Audit Procedures, Knox County Health Department, June 2018.
• Quality Bulletin – Increased Frequency of PM2.5 Leak Checks, Knox County Health
Department, June 2018.
• Quality Bulletin – Ozone Transfer Standard Back Pressure Compensation, Knox County
Health Department, June 2018.
• Quality Bulletin – TEOM Flow Verification Validation and Action Points, Knox County
Health Department, August 2017.
• Quality Bulletin – Rounding Convention During Verifications and Audits, Knox County
Health Department, August 2017.
• Quality Bulletin – Changes to Lead Auditing Procedure, Knox County Health Department,
August 2017.
3.0 Commendations
The KCDAQM ambient air monitoring program is currently operated and maintained exclusively
by three staff members (not including a vacant Environmental Specialist position). The staff are
tasked with managing all components of the air program, namely, network design, field operations
(e.g., instrument installation, maintenance, and repair), managing contractors (e.g., gravimetric
analysis of the PM2.5 filters by Inter-Mountain Laboratory (IML) and lead filters by Eastern
Research Group (ERG)), records and data management, quality control/quality assurance
(QA/QC) activities, and data reporting to the EPA’s AQS database. Despite all of these
responsibilities, the staff are meeting the regulatory requirements of the air monitoring program.
KCDAQM staff appeared proficient in and knowledgeable of their roles and responsibilities. The
staff’s commitment to producing quality data and having high data capture was evident during the
audit. At the ozone monitoring stations (Spring Hill and East Knox), KCDAQM invested in new
ozone monitors and installed a new ozone sampling system that allows automated through-the-probe
quality control (QC) checks; these QC checks are conducted nightly for improved data capture.
A few of KCDAQM’s improvements in the particulate matter network include discontinuing PM10
filter analysis by switching to continuous PM10 monitors and alternating the cooler shipping dates
with the courier to eliminate temperature variability, thereby increasing data capture. The agency
also began contracting lead (Pb) analysis via the national contract with Eastern Research Group
(ERG).
A significant effort has been dedicated to bolstering the employee training program. KCDAQM
has produced 13 training videos for site operators and quality assurance staff that cover a range of
related topics (e.g., logbooks, Hi-Vol filter removal, and sampler setup). The training videos
are embedded into the updated quality system documents as well. Further, KCDAQM developed
the Quality Assurance (QA) database to streamline data certification and data review and to
increase efficiency. The QA database performs several functions: it tracks transfer standard
certifications and their usage during field deployment, serves as a repository for field QC
forms, and is searchable for any QC data that does not meet established acceptance criteria. Lastly,
all certification records for the standards were easily located and provided to the EPA.
Overall, KCDAQM has a strong monitoring program and continues to make progress in becoming
a model program.
4.0 Findings and Recommendations
The observations from this TSA were compared to USEPA regulations, technical policies and
guidance, and the KCDAQM quality system documentation.
Quality system deviations found through this TSA are classified into three categories: Findings,
Concerns, and Observations. These quality system deviations are defined as follows:
Finding:
Departure from or absence of a specified requirement (regulatory, QMP,
QAPP, SOP, etc.) or guidance deviation which could significantly impact
data quality.
Concern:
Practices thought to have a potentially detrimental effect on the ambient air
monitoring program’s operational effectiveness or on the quality of sampling or
measurement results.
Observation:
An infrequent deviation, error, or omission which does not impact the quality
of the work product, but may impact the record for future reference.
For each of these categories, corrective action recommendations are provided. Corrective actions
are required for all quality system deviations ranked as Findings or Concerns. Depending on the
severity of the deviation, a specific data deliverable(s) may be requested to show that the corrective
action recommendation has been successfully implemented. In these cases, the TSA report will
specify the deliverable(s) required to be uploaded to AQS and/or submitted to EPA. Observations
do not require corrective actions.
4.1 FIELD OPERATIONS
4.1.1 Finding: Air monitoring sites did not meet siting requirements stated in 40 CFR 58,
Appendix E. Two out of the six active air monitoring stations evaluated for 40 CFR Part
58, Appendix E siting criteria were found to have particulate sampler probes which did not
meet established regulatory requirements for distance and spacing.
Discussion: 40 CFR Part 58, Appendix E details the probe and monitoring path siting
criteria for ambient air quality monitors. As stated in Appendix E, Section 1, “Adherence
to these siting criteria is necessary to ensure the uniform collection of compatible and
comparable air quality data… Specific siting criteria that are phrased with a “must” are
defined as requirements and exceptions must be approved through the waiver provisions.”
The Appendix contains multiple sections that detail the spacing and distance requirements
for probe placement. The following paragraphs will summarize the issues observed during
the KCDAQM TSA in relation to these requirements.
a) Trees can provide surfaces for SO2, NO2, and ozone adsorptions or reactions, as well
as surfaces for particle deposition. Because of vegetation’s ability to scrub pollutants,
40 CFR Part 58, Appendix E, § 5 requires that 90% of a probe’s monitoring path be at
least 10 meters from the drip-line of trees. In the KCDAQM network, EPA
auditors observed the following sites at which monitoring inlets or probes did not meet
the minimum distance requirement: Burnside (i.e., 9.6 meters) and Ameristeel (i.e., 9.3
meters). These violations are due to annual tree growth and are likely to remain an issue.
b) 40 CFR Part 58, Appendix E, § 4 details the requirements for spacing from
obstructions. Additionally, 40 CFR Part 58, Appendix E, § 5(a) states, “Trees can also
act as obstructions in cases where they are located between the air pollutant sources or
source areas and the monitoring site, and where the trees are of a sufficient height and
leaf canopy density to interfere with the normal airflow around the probe, inlet, or
monitoring path.” The 2017 version of the EPA Quality Assurance Handbook for Air
Pollution Measurement Systems, Volume II (QA Handbook) also discusses trees as
obstructions in Section 7.1. The QA Handbook further explains the rationale behind
the distance requirement: “It is important for air flow around the monitor to be
representative of the general air flow in the area to prevent sampling bias.” In the
KCDAQM network, trees were observed acting as obstructions at the Bearden (47-093-0028)
air monitoring station. Bearden has trees to the north and west, and the site is close to not
meeting the CFR requirement that PM2.5 samplers “must have unrestricted airflow 270
degrees around the probe or sampler.” Because the site is surrounded by trees, the
obstruction will worsen as the foliage grows. A more detailed assessment by KCDAQM is
required to ensure this regulatory requirement is met.
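The drip-line distance criterion in item (a) reduces to a simple comparison against the 10-meter minimum. The sketch below is illustrative only (it is not an EPA or KCDAQM tool); the Burnside and Ameristeel distances are the values observed during this TSA, while the third site and its distance are hypothetical.

```python
# Illustrative check of measured probe-to-tree-drip-line distances against
# the 10 m minimum in 40 CFR Part 58, Appendix E, Section 5.
MIN_DRIPLINE_DISTANCE_M = 10.0

def meets_dripline_minimum(distance_m: float) -> bool:
    """True if the probe/inlet is at least 10 m from the tree drip-line."""
    return distance_m >= MIN_DRIPLINE_DISTANCE_M

# Burnside and Ameristeel values are from this TSA; "Air Lab" is hypothetical.
observed = {"Burnside": 9.6, "Ameristeel": 9.3, "Air Lab": 12.0}
for site, d in sorted(observed.items()):
    verdict = "meets" if meets_dripline_minimum(d) else "does NOT meet"
    print(f"{site}: {d:.1f} m {verdict} the 10 m minimum")
```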
Recommendation: For the sites where violations were observed, KCDAQM must address
these siting issues as quickly as possible. The trees may be removed or trimmed, the
sampler location(s) may be adjusted, or the sites may be relocated away from these
obstacles. Second, regarding the data collected in the KCDAQM network, EPA
recommends that data associated with the violating samplers be flagged in the AQS database.
EPA requires flagging to begin with the January 1, 2019 data and to continue until
evidence provided to EPA demonstrates that these siting issues have been
corrected. The AQS “SX” qualifier flag (i.e., does not meet siting criteria) should be
applied to the impacted data. EPA requests copies of finalized AQS reports (i.e., AMP 350)
for the 2019 data set that show the application of this qualifier flag to the data from these
sites/monitors.
4.1.2 Concern: Several air monitoring stations are operating with aging ambient air monitoring
equipment, which could impact data completeness.
Discussion: The TSA questionnaire submitted to EPA prior to the audit described a list of
instrumentation needs for the KCDAQM air monitoring network. The primary concern was in
the PM2.5 FRM network due to instrument age and failure, with approximately 70% of the
monitors being 14 years old (i.e., 2005 manufacturer’s date). The typical instrument
lifespan is 5 - 7 years. Due to the age of KCDAQM’s current instrumentation,
manufacturers are no longer supporting these samplers (i.e., Thermo 2025 FRM samplers).
This poses a vulnerability because spare parts used for instrument repair and annual
maintenance have become difficult to obtain for continued operation in the network.
One example of the PM2.5 dataset being impacted by aging instrumentation occurred at
Bearden (47-093-0028) in 2017. Samples did not meet critical criteria for filter temperature
excursions as defined in Appendix D validation templates of the QA Handbook. During
data review, EPA auditors observed AQS “X” qualifier flags, which indicate a filter
temperature difference, on multiple sampling dates for the primary sampler (POC 1) in
October and November (i.e., sample dates of 10/13-16, 19, 22, 25, 28, 31; 11/3, 6). The X
qualifier flag was applied, per 40 CFR 50, Appendix L, § 7.4.11.4, following any
occurrence in which the filter temperature (any filter temperature for sequential samplers)
exceeds the ambient temperature by more than 5 °C for more than 30 consecutive minutes
during either the sampling or post-sampling periods of operation. The agency addressed
the temperature excursions by replacing a faulty instrument fan on November 8, 2017, and
used the AQS qualifier flag to alert data users to the potential impact on sample validity.
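The 5 °C / 30-minute criterion above is straightforward to express in code. The sketch below is one illustrative way to test a day of paired minute-resolution readings; the example data are fabricated and do not represent KCDAQM's data system.

```python
# Sketch of the 40 CFR 50, Appendix L, Sec. 7.4.11.4 test: flag a sample if
# the filter temperature exceeds ambient temperature by more than 5 degC for
# more than 30 consecutive minutes. Inputs are assumed to be paired 1-minute
# readings; the data below are fabricated for illustration.
def filter_temperature_excursion(filter_c, ambient_c,
                                 delta_c=5.0, max_run_min=30):
    """True if the exceedance persists for more than max_run_min minutes."""
    run = 0
    for f, a in zip(filter_c, ambient_c):
        if f - a > delta_c:
            run += 1
            if run > max_run_min:
                return True  # warrants the AQS "X" qualifier flag
        else:
            run = 0  # exceedance must be consecutive
    return False

ambient = [20.0] * 45
filt = [26.0] * 31 + [22.0] * 14   # 31 consecutive minutes at +6 degC
print(filter_temperature_excursion(filt, ambient))  # True
```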
Further, in accordance with 40 CFR 50, Appendix L, § 10.12: “All factors related to the
validity or representativeness of the sample, such as sampler tampering or malfunctions,
unusual meteorological conditions, construction activity, fires or dust storms, etc. shall be
recorded as required by the local quality assurance program. The occurrence of a flag
warning during a sample period shall not necessarily indicate an invalid sample but rather
shall indicate the need for specific review of the QC data by a quality assurance officer to
determine sample validity.” Additionally, X qualifier flags were applied to the
collocated sampler (POC 2) in March – August 2017 (3/8; 4/13, 20; 5/7, 20; 6/6, 12, 18, 24; 7/6,
12, 18, 25, 30; 8/5, 11, 17, 23).
The Rule (47-093-1017) ambient air monitoring station did not meet data completeness
goals (i.e., ≥75%) for the 1st and 2nd quarters of 2018 due to numerous malfunctions
attributed to the aging instrumentation. For example, a June 1, 2018 Corrective Action
Report (Serial Number SN2025B226541005) indicated the PM2.5 sampler had exchange
errors and an April 25, 2018 leak check did not meet the acceptance criterion, causing
the invalidation of 12 filters from March 24 – April 24, 2018 (the AQS “AK” null code,
Filter Leak, was applied). The leak check coupled with filter exchange errors affected data
completeness. During the site evaluation, the site operator and staff informed EPA auditors
that the instrument (Thermo 2025, SN2025B226541005) was known to exhibit recurring
filter exchange errors or not run when scheduled. As indicated in the July 27, 2018
Corrective Action Report (Serial Number SN2025B226541005), 2nd quarter data
completeness was < 75% due to a failed leak check and filter exchange errors from the
instrument. The site operator exchanged the filter pump due to a decrease in pump efficiency,
which impaired the pneumatic system used for filter exchange. The samples associated with the
corrective action received an AQS “V” qualifier flag (i.e., validated data).
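The 75% quarterly completeness goal discussed above is a simple ratio of valid to scheduled samples. The sketch below uses illustrative counts (roughly a 1-in-3 day schedule of 30 samples per quarter, minus the 12 invalidated filters); the figures are not pulled from AQS.

```python
# Sketch of a quarterly data completeness calculation: the percentage of
# scheduled samples that produced valid data, compared with the >= 75% goal.
# The sample counts are illustrative, not values taken from AQS.
def completeness_pct(valid: int, scheduled: int) -> float:
    """Completeness as a percentage of scheduled samples."""
    return 100.0 * valid / scheduled

scheduled = 30            # roughly a 1-in-3 day schedule for one quarter
invalidated = 12          # e.g., filters voided after a failed leak check
pct = completeness_pct(scheduled - invalidated, scheduled)
print(f"{pct:.0f}% complete; 75% goal {'met' if pct >= 75.0 else 'NOT met'}")
```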
Recommendation: Please provide EPA with a discussion of how the agency intends to
improve data capture and quality in the PM2.5 network (e.g., increased verification
frequencies, performance audits and/or a more robust maintenance schedule). EPA
recommends establishing a replacement schedule for the instrumentation that has been in
service for more than its intended lifespan.
4.1.3 Observation: Quality system documents and instrument manuals are not accessible at all
sites.
Recommendation: EPA recommends KCDAQM make the appropriate quality systems
documents accessible at all air monitoring stations. The availability of these documents
serves as a best practice and good resource for the operators to ensure consistency in quality
control. Although operators are cross-trained on the various instrumentation, the ability to
quickly reference the quality system documents is invaluable.
4.2 LABORATORY OPERATIONS
No laboratory operations were observed during the TSA. KCDAQM contracts PM2.5
laboratory operations to the Inter-Mountain Laboratory (IML). Additionally, KCDAQM
utilized the national contract with ERG for analysis of criteria lead (Pb) samples during the audit
period. No other criteria pollutant, laboratory-based methods are currently utilized by
KCDAQM.
4.3 RECORDS MANAGEMENT
No issues were identified with the management of air monitoring records during this
TSA. Records associated with the air monitoring equipment were found to be well-
organized and both easily and quickly accessible by KCDAQM staff at the main office.
4.4 DATA MANAGEMENT
4.4.1 Finding: Performance evaluation results were entered into the incorrect audit-level entry
field in the AQS database.
Discussion: EPA published a revision to 40 CFR Part 58 in March 2016. The requirements
by which annual performance evaluations are to be performed were revised. The previous
version used a five audit-level structure, a series of five concentration ranges from which
test atmosphere concentrations must be selected, and the rule required that an instrument
be challenged by three consecutive audit levels. The rule promulgated in March 2016
expanded the number of audit levels to ten and changed the rules dictating which levels are
to be selected. 40 CFR Part 58, Appendix A, § 3.1.2.1 states:
The evaluation is made by challenging the monitor with audit gas standards of
known concentration from at least three audit levels. One point must be within two
to three times the method detection limit of the instruments within the PQAOs
network, the second point will be less than or equal to the 99th percentile of the
data at the site or the network of sites in the PQAO or the next highest audit
concentration level. The third point can be around the primary NAAQS or the
highest 3-year concentration at the site or the network of sites in the PQAO. An
additional 4th level is encouraged for those agencies that would like to confirm the
monitors' linearity at the higher end of the operational range. In rare
circumstances, there may be sites measuring concentrations above audit level 10.
Notify the appropriate EPA region and the AQS program in order to make
accommodations for auditing at levels above level 10.
A review of the results from ozone annual performance evaluations submitted to AQS (i.e.,
AMP 504: Extract QA Data and AMP 251: QA Raw Assessment) showed that the low-
level requirement for O3 was being met in 2017 - 2018; however, the agency was not
entering the audit levels into the correct AQS audit field. KCDAQM uses the Teledyne
API Model 400 Analyzer, which has a Federal Reference and Equivalent Method
(FRM/FEM) code designation of EQOA-0992-087 (i.e., AQS reference method code of
087). According to the metadata in AQS, the minimum detection limit (MDL) for method
code “087”, the method utilized by the agency, is 0.005 ppm. Three times the MDL is 0.015
ppm, which falls into the second audit level, 0.006-0.019 ppm, according to 40 CFR Part
58, Appendix A, § 3.1.2.1 and the May 2016 OAQPS Technical Note- Guidance on
Identifying Annual PE Audit Levels Using Method Detection Limits and the 99th
Percentile. Although KCDAQM performed audits at the correct audit level in 2017 -
2018, the assessment and monitor concentrations for the low audit level were entered into
the incorrect AQS audit field (i.e., the field for audit level 3 concentrations was used even
though all concentrations corresponded to audit level 2). Further, the reported ozone
annual performance evaluations in AQS did not adhere to established procedures recorded
in Section 8.2.2.2 of KCDAQM’s Data Handling SOP, which states “Data should only be
entered into the Audit levels which were evaluated. Leave the others blank.”
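The arithmetic behind this Finding (3 × MDL = 0.015 ppm, which falls in audit level 2) can be sketched as below. Only the MDL and the level 2 range come from this report; the level 3 range shown is for illustration, and the authoritative boundaries are the audit level table in 40 CFR Part 58, Appendix A.

```python
# Sketch of selecting the low annual-PE audit level from the method MDL.
# The MDL (0.005 ppm for AQS method code 087) and the level 2 range
# (0.006-0.019 ppm) are stated in this report; the level 3 range is shown
# for illustration only -- consult 40 CFR Part 58, Appendix A for the table.
O3_AUDIT_LEVELS_PPM = {
    2: (0.006, 0.019),
    3: (0.020, 0.039),  # illustrative
}

def low_audit_level(concentration_ppm: float):
    """Return the audit level whose range contains the given concentration."""
    for level, (low, high) in O3_AUDIT_LEVELS_PPM.items():
        if low <= concentration_ppm <= high:
            return level
    return None

mdl_ppm = 0.005                 # AQS method code 087
target = 3 * mdl_ppm            # 0.015 ppm
print(low_audit_level(target))  # 2: results belong in the level-2 AQS field
```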
Recommendation: Please correct the low audit level assessment and monitor
concentrations in AQS. Please provide EPA an AQS AMP 251 report showing that this
correction is made, once completed.
4.4.2 Finding: The verification results for the main flow rate of the continuous PM10 monitor
were not loaded into AQS, as required by 40 CFR § 58.16.
Discussion: The QA Raw Assessment Report (AMP 251) summarizes the QA/QC data
reported by the agency. In reviewing the AMP 251 in preparation for this TSA, EPA
auditors observed that only one of the two required flow rate verification results had been
submitted to AQS for the continuous PM10 monitor (i.e., the Thermo Scientific TEOM 1405
Ambient Particulate Monitor, which operates with flow rates of 3 and 16.7 LPM). AQS
contained only the verification results for the total flow rate (16.7 LPM); the main flow
rate (3 LPM) results were missing. 40 CFR 58.16(a) requires the agency
to report to AQS all ambient air quality data and associated quality assurance data for the
various pollutant parameters, including continuous PM10. Per the FEM designation
(EQPM-1090-079) and the instrument manual, the Thermo TEOM 1405 has two flow rates
(i.e., main and total) required to be reported to AQS. The data in question had been
collected by KCDAQM, but had not been compiled for AQS submission.
Recommendation: In accordance with 40 CFR 58.16, the 2016 - 2018 ambient
concentrations and QC data for the parameter must be uploaded to AQS. Please provide
EPA with an AMP 251 report as a deliverable for this action item.
4.4.3 Finding: A PM2.5 semi-annual flow rate audit recorded in AQS exceeded the
acceptance criterion; however, the associated ambient data remain in AQS.
Discussion: A PM2.5 semi-annual flow rate audit was performed on the primary sampler at
Bearden (47-093-0028) on November 20, 2018. The internal auditor recorded the results
in the logbook and on the PM2.5 Audit Calculations form as -5.6% difference. The internal
auditor drafted a November 29, 2018 memorandum titled 2018 Fourth Quarter Air
Monitoring Audit specifying that the flow rate was not acceptable, that the sampler had been
calibrated twice in 2018, and that another calibration of the sampler was necessary. At the time
of the TSA, no corrective action report had been initiated for this flow rate audit; a site
operator performed a calibration after the audit. Further, there was a transcription error
regarding the transfer standard flow rate from the logbook and audit form compared to the
values entered into AQS (i.e., 17.67 LPM in the logbook and audit form but 17.76 LPM in
AQS). Nonetheless, the AMP 251 report showed audit results (i.e., -5.6% or -6.1% difference)
that did not meet the acceptance criterion. The associated data affected by the audit remain in
AQS, although the data should have been invalidated per the Quality Assurance Project
Plan for the Knox County Air Quality Management Ambient Air Monitoring Program,
Revision 0, November 2018.
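The percent-difference arithmetic behind this Finding can be sketched as below. The sampler reading of 16.68 LPM is back-calculated to reproduce the -5.6% and -6.1% figures above and is an assumption, as is the ±4% acceptance limit (a common QA Handbook-style value); KCDAQM's QAPP governs the actual criterion.

```python
# Sketch of the flow rate audit calculation: percent difference between the
# sampler's indicated flow and the audit transfer standard's flow. The
# sampler reading (16.68 LPM) and the +/-4% limit are illustrative
# assumptions, not KCDAQM records.
def pct_difference(sampler_lpm: float, standard_lpm: float) -> float:
    """Percent difference of sampler flow relative to the audit standard."""
    return 100.0 * (sampler_lpm - standard_lpm) / standard_lpm

sampler = 16.68
for standard in (17.67, 17.76):   # logbook value vs. mistyped AQS value
    d = pct_difference(sampler, standard)
    verdict = "PASS" if abs(d) <= 4.0 else "FAIL"
    print(f"standard {standard:.2f} LPM -> {d:+.1f}% ({verdict})")
```

Note how the single transposed digit (17.67 vs. 17.76 LPM) shifts the computed difference from -5.6% to -6.1%, which is why a second reviewer for manual AQS entry is recommended.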
Recommendation: Please follow KCDAQM quality system documents and correct the
samples affected by the flow rate acceptance criterion exceedance in AQS. Please provide
EPA an AQS AMP 350 report showing that this correction is made, once completed. EPA
recommends an additional reviewer for manual entry of data into AQS to minimize
transcription errors.
4.4.4 Finding: Data associated with a collocated PM10 sampler in 2016 should be removed from
AQS due to data quality concerns.
Discussion: During data review, significant data quality concerns were identified with the
collocated PM10 sampler at Air Lab (47-093-1013). For example, all data for the collocated
sampler in April and May of 2016 were assessed with an AQS “QX” qualifier flag (i.e., does
not meet QC criteria). During discussions with KCDAQM staff, it was discovered that,
following a failed QC check, the QX flags were applied to ambient air data that a statistical
analysis had deemed acceptable. Further, data completeness (i.e., 0%) was not met in the 1st
quarter due to a laboratory issue, discovered in the 2016 TSA (SESD Project #16-0048,
Finding 4.4.3), causing all related data to be invalidated.
Recommendation: EPA recommends the collocated PM10 sampler (POC 2) data be
removed from AQS due to the data quality concerns described above including insufficient
data capture to determine bias and accuracy. Please provide EPA an AQS AMP 350 report
showing that this correction is made, once completed.
4.4.5 Finding: Reported ambient concentrations in AQS did not meet the agency’s quality
system requirements.
Discussion: A Pb flow rate verification check performed on September 15, 2017 at the
Ameristeel (47-093-0023) air monitoring station yielded a result (i.e., 9.4% difference) that
did not meet the acceptance criterion established in KCDAQM’s quality system documents.
The date of the check was recorded in two places: the instrument logbook (September 15,
2017) and the quality control form, the Flow Verification Sheet (September 18, 2017).
During the data verification and validation process on September 28, 2017, this date
transcription error was recognized, investigated, and resolved in favor of the date in the
instrument logbook; the date on the Flow Verification Sheet was identified as incorrect
because the evidence pointed to the logbook as accurate. Nevertheless, data was
invalidated in AQS using the incorrect date from the uncorrected Flow Verification Sheet
rather than the instrument logbook, and consequently not all impacted data was removed
from AQS.
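The flow rate verification comparison behind this finding can be sketched as follows. The flow values below are hypothetical, chosen only to reproduce the 9.4% result; the ±4% acceptance criterion is also an illustrative assumption, not the criterion from KCDAQM's quality system documents.

```python
def flow_rate_percent_difference(sampler_flow: float, standard_flow: float) -> float:
    """Percent difference of the sampler's indicated flow vs. the transfer standard."""
    return (sampler_flow - standard_flow) / standard_flow * 100.0

# Hypothetical flows (L/min) chosen to reproduce the ~9.4% result in the finding;
# the actual flow values are not given in this report.
pct = flow_rate_percent_difference(18.25, 16.68)
exceeds = abs(pct) > 4.0  # assumed +/-4% criterion -> exceedance
```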
Recommendation: Once completed, please provide EPA an AQS AMP 350 report showing
that this correction has been made. EPA recommends an increased level of review by the
operator before data packages are turned over to data validation staff; this risk can also be
mitigated by devoting more resources to quality assurance.
4.4.6 Observation: KCDAQM does not follow up on the implementation of quality system
documents.
Recommendation: KCDAQM does not verify that staff have properly implemented
quality system documents. According to the agency's completed TSA Questionnaire, “each
employee will receive an emailed link to a copy of the QAPP with a request for a read
receipt. The email will instruct employee to read.” Staff are not trained on the new or
updated procedures associated with the documents. EPA recommends, at a minimum, that
KCDAQM staff acknowledge receipt of, and compliance with, each quality system
document via signature once it has been finalized and distributed. This verification can
be assessed during the annual completion of the Employee Competency Check Sheet.
4.5 QUALITY ASSURANCE
4.5.1 Finding: The agency was not operating under an approved QAPP during the 2016 - 2018
review time frame. Criteria pollutant data was being collected at SLAMS monitoring
stations without a current, approved QAPP in place.
Discussion: Monitors collecting data for regulatory decision-making purposes must have
a current, approved QAPP in place, pursuant to 40 CFR Part 58, Appendix A, § 2.1.2. The
regulation further states that QAPPs must contain SOPs, either attached or appropriately
referenced. In accordance with EPA Region 4 grant commitments, QAPPs should be
reviewed annually and revised every 5 years (minimum). The KCDAQM QAPP in effect
during the 2016 - 2018 calendar years (i.e., Quality Assurance Project Plan for the Knox
County, TN Ambient Air Quality Monitoring Program, October 2009) was approved by
EPA on November 5, 2010. The 2016 TSA report identified that the agency’s QAPP, at
that time, did not reflect the current NAAQS and was overdue for its required revision; its
associated SOPs were also outdated or missing (see Finding 4.5.1, SESD Project #16-
0408). As of this 2019 TSA report, the QAPP has been finalized and was approved on
December 21, 2018. As a result, as part of the 2016 and 2017 data certification process,
the AQS AMP 600 Report (Certification Evaluation and Concurrence) generated by the
agency recommended an “N” flag for all KCDAQM monitors because the QAPP had not
been approved in more than 5 years. (Note: An AQS “N” flag on an AMP 600 report means
that the certifying agency and/or the EPA has determined that issues regarding the quality
of the ambient concentration data cannot be resolved.)
Recommendation: The agency must apply AQS “6” (i.e., QAPP Issue) qualifier flags to
its entire 2016 - 2018 data set to alert end data users to this quality system deficiency.
Please provide an AQS AMP 350 report showing that the flags have been applied to data
from 2016 through December 21, 2018 as a deliverable for this corrective action.
Going forward, EPA recommends that the QAPP, and its associated SOPs, be reviewed on
an annual basis, with the review documented to attest to its completion. The QAPP should
be revised whenever there are significant changes, such as regulatory modifications or
changes within the monitoring agency, and, at a minimum, must be revised within 5 years
of its approval date.
4.5.2 Concern: Documentation lacks sufficient detail to record events and support data
decisions.
Discussion: Several records were reviewed while visiting air monitoring stations and
performing in-office TSA activities. The air monitoring station and instrument logbooks
for 2016 - 2018 were reviewed at the main office. During this review, EPA auditors
identified instances where forms were not being utilized as intended. Blank spaces or
incomplete information were observed on data forms where required information,
measurements, or calculations were expected. Multiple QA/QC forms (i.e., “Stickies,” data
form stickers placed in instrument logbooks) were not filled out completely; specifically,
the percent difference calculations were often incomplete (e.g., Ozone T400 Analyzer, SN
4006, 2018 Logbook dated May 5, 2018). Prose-style comments by staff sometimes lacked
the detail needed to recreate events or shed light on data quality concerns. Additionally, no
signatures or dates were observed showing that review by data verifiers and validators had
been conducted. All KCDAQM logbooks and QA/QC forms should contain more detail to
sufficiently narrate events and clearly indicate the decision-making process regarding
data coding, data reduction, and data handling. During the data review process, EPA
auditors, on several occasions, had to ask staff to explain either the AQS qualifier code
assigned to data that did not meet regulatory requirements or why data had been
invalidated, because the records alone did not answer these questions.
An example of insufficient documentation involved a data decision made as part of a
January 30, 2018 Corrective Action Report (Serial Number SN189) for ozone automated
nightly quality control checks using a Teledyne API 703E Calibrator (Serial #189). The
corrective action report described the investigation into a power surge and a faulty
photometer lamp on the calibrator, which contributed to invalid QC checks. During the
data validation process, KCDAQM deemed the ambient air data associated with the ozone
analyzer valid for the period when nightly QC checks did not occur (i.e., March 21 – April
13, 2018) and decided to apply an AQS “6” (i.e., QAPP issue) qualifier flag to the dataset;
this decision is also documented on the corrective action report. However, when EPA
auditors reviewed this data in the AMP 350 report, they discovered that the reported data
contradicted the QA decision stated on the corrective action report: the AMP 350 report
shows that an AQS “V” qualifier flag (i.e., Validated value) was applied to the dataset
instead. KCDAQM staff recalled the decision to change the qualifier flag; however,
documentation could not be produced to support the application of the “V” qualifier flag.
Another example of incomplete documentation involved a June 1, 2018 Corrective
Action Report (Serial Number SN2025B226541005), detailed in Finding 4.1.2. On April
25, 2018, a failed leak check caused 12 filters from March 24 – April 24, 2018 to be
invalidated. The leak check, coupled with filter exchange errors, affected data completeness.
The corrective action report was well documented; however, its signature block was not
complete, signifying that the quality assurance data verification and validation process had
not been finalized.
A third example involved a multi-point verification QC check for an ozone monitor, which
was recorded on a logbook “Stickie.” In the Ozone T400 Analyzer, SN 4006, 2018 Logbook,
dated May 5, 2018, the percent difference calculations were incomplete: the site operator
documented all other information but only circled “Pass” to indicate that the multi-point
verification QC check met the acceptance criterion.
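The omitted percent-difference calculation is simple arithmetic, which makes its absence from the forms notable. The sketch below illustrates a multi-point verification; the ±7% criterion and the verification points are illustrative assumptions, not values from KCDAQM's quality system documents or the logbook in question.

```python
def percent_difference(measured: float, actual: float) -> float:
    """Percent difference of the analyzer response vs. the transfer-standard value."""
    return (measured - actual) / actual * 100.0

def verify_multipoint(points, criterion_pct=7.0):
    """Evaluate each (analyzer, standard) pair against an acceptance criterion.
    criterion_pct is an assumed illustrative limit."""
    diffs = [percent_difference(m, a) for m, a in points]
    passed = all(abs(d) <= criterion_pct for d in diffs)
    return diffs, passed

# Hypothetical ozone verification points in ppb: (analyzer reading, standard value)
points = [(44.1, 45.0), (89.2, 90.0), (181.5, 180.0)]
diffs, passed = verify_multipoint(points)
```

Recording the individual percent differences, rather than only circling "Pass," is what allows a later reviewer to confirm the check independently.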
Recommendation: Data forms should be filled out completely by operators, and prose-style
comments should be augmented with more specific details, particularly regarding issues that
impact data validity. When documents are reviewed during data verification and validation,
each reviewer should sign and date the reviewed document or package, indicating that the
review was complete. Please provide EPA with a plan to improve documentation practices.
4.5.3 Concern: There is a lack of sufficient backup instrumentation in the KCDAQM air
monitoring network.
Discussion: In the KCDAQM air monitoring network, there are no functioning spare PM2.5
samplers available for deployment when one of the five samplers operating at the four air
monitoring stations breaks down. As stated above, Concern 4.1.2 points to the necessity for
backup instrumentation (i.e., instrument age and failure: more than 70% of the monitors are
14 years old, while the typical instrument lifespan is 5 - 7 years). The lack of backup
equipment presents a significant vulnerability to data collection in the event of a malfunction
in the field. For example, the Thermo 2025 FRM sampler operating at Rule (47-093-1017)
regularly malfunctions (e.g., filter temperature excursions from July 22 – August 16, 2016;
filter exchange errors on August 5 and November 6 – 21, 2017; filter leaks from March 18 –
April 23, 2018). For the ozone network, there are new ozone monitors at East Knox and
Springhill; however, backup instruments are limited to one uncertified calibrator and
one ozone analyzer. Spare PM2.5 samplers and ozone instrumentation are necessary to
decrease data collection vulnerabilities in the network. Further, there is a regulatory
requirement to have sufficient monitoring resources (i.e., instrumentation and spare parts),
per 40 CFR Part 58, Appendix A, § 2.1.3, which states: “The PQAO/monitoring organization's
quality system must have adequate resources both in personnel and funding to plan,
implement, assess and report on the achievement of the requirements of this appendix and
its approved QAPP.”
Several of the findings and concerns described in this TSA report contain narratives traced
to either malfunctioning or aging instrumentation (i.e., Finding/Concerns 4.1.2, 4.4.4, and
4.5.2).
Recommendation: Please provide EPA with a discussion of how the agency will procure
and maintain an adequate inventory of backup instrumentation and spare parts, specifically
for instrumentation no longer supported by the manufacturer, within its air monitoring
program.
4.5.4 Concern: More resources need to be devoted to quality assurance.
Discussion: Pursuant to 40 CFR Part 58, Appendix A, § 2.2, the monitoring organization
must provide for a quality assurance management function, which must have technical
expertise to conduct independent oversight of the agency’s air monitoring program.
Specifically, this Appendix A requirement states:
The quality assurance management function must have sufficient technical
expertise and management authority to conduct independent oversight and assure
the implementation of the organization's quality system relative to the ambient air
quality monitoring program and should be organizationally independent of
environmental data generation activities.
In addition, 40 CFR Part 58, Appendix A, § 2.1.3 states, “The monitoring organization's
quality system must have adequate resources both in personnel and funding to plan,
implement, assess and report on the achievement of the requirements of [Appendix A] and
its approved QAPP” [emphasis added].
Additionally, independent data validation and assessment is an essential component of
quality assurance oversight. To that end, for a monitoring network that operates numerous
monitoring sites, a tiered data review approach with multiple members is needed to ensure
data verification/validation processes are effective and successful. A tiered approach
typically starts with self- and peer-review verification activities (meaning review by the
site operator followed by an independent, technical reviewer), followed by independent
validation that culminates in various validation assessments. These activities include not
only assessing the ambient data directly but also reviewing the supporting documentation
and records for completeness and accuracy. Due to limited resources, this structure (or a
similar one) was not always in place to effectively execute the QA component. Findings/
Concerns 4.1.3, 4.2.7, 4.4.1 through 4.4.7, and 4.5.3 illustrate where the lack of a
multi-tiered data review approach has allowed data validation errors to be reported to AQS.
KCDAQM operates an ambient air monitoring network with three staff members; there is
one vacant Environmental Specialist position. Given the agency's size, maintaining the
independence of the multi-tiered review described above is challenging. As currently
constructed, an Environmental Specialist (or site operator) provides the first level of data
review; another Environmental Specialist, devoted solely to quality assurance, provides the
remaining levels of data verification and validation; and finally, the Air Monitoring
Manager reviews the data decisions (i.e., quarterly tracing a random data point for each
pollutant from the dataset) to end the multi-tiered data review. The Environmental
Specialist (QA) carries substantial responsibility: in addition to reviewing logbooks,
QA/QC forms, and AQS submissions as part of data review and performing all QA-related
functions, this position carries out various other duties for the agency (e.g., QA database
management, training videos, and quality system document development). Further, due to
the vacant position, the Environmental Specialist (QA) will now conduct ozone and PM2.5
verifications until a new hire fills the vacancy. These are all signs pointing to the need for
additional QA resources, which are necessary to maintain a robust quality system.
Recommendation: EPA recommends the agency augment its quality assurance activities
to include those described above, with additional effort placed particularly on data
validation. To that end, the agency would greatly benefit from an additional staff member
(or members) assigned QA responsibilities and from a more robust review of data decisions
by the Air Monitoring Manager (i.e., a statistical trace of approximately 10% or more of
the data). Please provide EPA with a discussion of how the agency will increase quality
assurance oversight within its air monitoring program.
4.5.5 Observation: The follow-up process for corrective action reports needs improvement.
Recommendation: KCDAQM should augment its corrective action process by
establishing time frames for when issues are to be reported and resolved, and by defining
the chain of command for reporting corrective actions. This strategy should be included
in the agency’s annual QAPP revision and future Quality Bulletins, so all staff are aware
of the process.
5.0 Conclusions
KCDAQM is commended for continuing to develop a strong ambient air monitoring program.
KCDAQM staff demonstrated technical proficiency when interviewed regarding the
instrumentation as well as their roles and responsibilities. There have been noticeable steps taken
to continue to improve and enhance the air monitoring program (e.g., new ozone monitors, QA
database, training videos and updated sampling configuration for gaseous instruments).
During this TSA, the findings and concerns identified signal a need for new and backup
instrumentation, which would reduce maintenance and repair responsibilities and improve
data completeness. Further, there is a need for more resources for quality assurance (i.e.,
reorganization of roles during the multi-tiered data review). Findings 4.1.1, 4.4.1 - 4.4.5,
and 4.5.1 in this TSA report will require the invalidation or qualification of ambient
concentration data reported to the AQS database. Please notify EPA when all corrections
have been made. Further, pursuant to 40 CFR 58.15, any modification to data in AQS after
it has been certified requires recertification of the data.
KCDAQM must develop a corrective action plan and timeline to address the findings and
concerns identified in Section 4 of this report and respond to EPA within 30 days of receipt
of the final TSA report. Please note that the corrective actions themselves do not have to
be completed by this date; only the plan to address the findings and concerns is due.
Observations do not require a corrective action and therefore do not need to be addressed.
If KCDAQM anticipates that the corrective action plan will not be completed within 30
days after receipt of the final TSA report, please contact EPA to request an extension.
Appendix 1 KCDAQM Response-Technical Systems Audit Form
APPENDIX A
United States
Environmental Protection Agency
Region 4
Science & Ecosystem Support Division
980 College Station Road
Athens, Georgia 30605
Ambient Air Monitoring
Technical Systems Audit Form
Contents
1. General ............................................................................................................................................... 28
a. Program Organization ..................................................................................................................... 29
a.1 Organizational Chart ..................................................................................................................... 29
a.2 Key Position Staffing...................................................................................................................... 30
b. Facilities .............................................................................................................................................. 31
c. General Documentation Policies ......................................................................................................... 32
d. Training ............................................................................................................................................... 33
d.1 Training Plan ................................................................................................................................. 33
d.2 Training Events .............................................................................................................................. 34
e. Oversight of Contractors and Supplies ............................................................................................... 35
e.1 Contractors.................................................................................................................................... 35
e.2 Supplies ......................................................................................................................................... 35
2. Quality Management .......................................................................................................................... 37
a. Status of QA Program ...................................................................................................................... 37
a.1 QA and QC Activities ..................................................................................................................... 37
a.2 QC Acceptance Criteria ................................................................................................................. 38
b. Internal PE Audits ............................................................................................................................ 39
b.1 Internal Audit Questions ............................................................................................................... 39
b.2 Internal Audit Procedures ............................................................................................................. 39
b.3 Certification of Audit Standards .................................................................................................... 40
b.4 Audit Equipment ........................................................................................................................... 40
b.5 Audit Acceptance Criteria ............................................................................................................. 42
c. Planning Documents Including QMP, QAPP, & SOP ........................................................................ 42
c.1 QMP Questions ............................................................................................................................. 42
c.2 QAPP Questions............................................................................................................................. 43
c.3 SOP Questions ............................................................................................................................... 45
d. Corrective Action ............................................................................................................................. 46
e. Quality Improvement ...................................................................................................................... 49
f. External Performance Audits ........................................................................................................... 49
3. Network Management........................................................................................................................ 50
a. Network Design ............................................................................................................................... 50
b. Siting ................................................................................................................................................ 50
b.1 Site Evaluations ............................................................................................................................. 50
b.2 Site Non-Conformance .................................................................................................................. 51
c. Waivers ............................................................................................................................................ 51
c.1 Waiver Questions .......................................................................................................................... 51
c.2 Waiver Types ................................................................................................................................. 52
d. Documentation ................................................................................................................................ 52
4. Field Operations .................................................................................................................................. 53
a. Field Support ................................................................................................................................... 53
b. Instrument Acceptance ................................................................................................................... 55
b.1 Instrumentation ............................................................................................................................ 55
b.2 Instrument Needs ......................................................................................................................... 55
c. Calibration ....................................................................................................................................... 56
c.1 Calibration Frequency and Methods ............................................................................................. 56
c.2 Calibration Questions .................................................................................................................... 56
d. Certification ..................................................................................................................................... 57
d.1 Flow Devices ................................................................................................................................. 57
d.2 Certification Questions ................................................................................................................. 57
d.3 Calibrator Certification .................................................................................................................. 60
e. Repair ............................................................................................................................................... 61
f. Record Keeping ................................................................................................................................ 62
5. Laboratory Operations ............................................................................................................................ 64
a. Routine Operation ........................................................................................................................... 64
a.1 Methods ........................................................................................................................................ 64
a.2 Quality System .............................................................................................................................. 65
b. Laboratory QC .................................................................................................................................. 66
b.1 Standards ...................................................................................................................................... 66
b.2 Laboratory Temperature and RH .................................................................................................. 67
c. Laboratory Preventive Maintenance ............................................................................................... 67
d. Laboratory Record Keeping ............................................................................................................. 68
e. Laboratory Data Acquisition and Handling ...................................................................................... 70
f. Filter Questions ............................................................................................................................... 72
g. Metals & Other Analyses ................................................................................................................. 73
g.1 Laboratory QA/QC ......................................................................................................................... 73
g.2 Chemicals ...................................................................................................................................... 74
g.3 Pb................................................................................................................................................... 74
6. Data & Data Management .................................................................................................................. 75
a. Data Handling .................................................................................................................................. 75
b. Software Documentation ................................................................................................................ 77
c. Data Validation and Correction ....................................................................................................... 78
d. Data Processing ............................................................................................................................... 78
d.1 Reports .......................................................................................................................................... 78
d.2 Data Submission ............................................................................................................................ 79
e. Internal Reporting ........................................................................................................................... 80
e.1 Reports .......................................................................................................................................... 80
e.2 Responsibilities ............................................................................................................................. 81
1. General
Note: As you answer the questions throughout this questionnaire, please keep in mind that answers to
some questions may be documented in your agency’s QMP, QAPP(s), SOP(s), and/or annual monitoring
network plan. As an alternative to providing language in the comment field for such questions, please
consider listing an appropriate reference to the document(s) – including document name and section
number – in which the relevant information has been documented. Such references should help reduce
the burden of completing this questionnaire by mitigating redundancy.
Knox County Department of Air Quality Management (KCDAQM)
Address:
140 Dameron Ave,
Knoxville, TN 37917
Date(s) of Technical Systems Audit: 1/14/2019
This section of the questionnaire completed by: Amber Talgo
Key Individuals (e.g., Agency Director, Ambient Air Monitoring Network Manager, QA Manager,
Technical Support/Instrument Repair Manager, etc.):
Title/Position Name
Director Lynne Liddington
Air Monitoring Manager Amber Talgo
Internal Auditor David Colvin
Environmental Specialist I (QA Officer) Rebecca Larocque
Environmental Specialist I (Field operator & repair) Barron White
Environmental Specialist I (Field operator & repair) Daniel Tipton
a. Program Organization
a.1 Organizational Chart
Upload an organizational chart, or attach to the form:
a.2 Key Position Staffing
Enter the number of personnel available to each of the following program areas, and any vacancies, if
applicable.
Program Area Number of People
(Primary) Number of People
(Backup) Number of Vacancies
Network Management (site setup, siting, ANP, etc.)
2 0 0
Field Operations (QC checks, site visits, site maintenance, etc.)
2 2 0
Quality Management (audits, QA documentation, certifications, etc.)
2 1 0
Data and Data Management (data review, validation and acquisition system, AQS, etc.)
1 1 0
Technical Support (equipment repair and maintenance)
3 0 0
Internal Analytical Laboratory (if applicable) (PM2.5 gravimetric, high-volume PM10/Pb, toxics, etc.)
N/A N/A N/A
Comment on the need for additional personnel, if applicable.
Click or tap here to enter text.
b. Facilities
Identify the principal facilities where the agency conducts work related to air monitoring. Do not include
monitoring stations, but do include facilities where work is performed by contractors or other
organizations. * “Air Lab” is laboratory and office space located at 1403 Davanna St. Knoxville, 37917
Ambient Air Monitoring Function
Facility Location Comment on any significant changes to be
implemented within the next one to two years.
Instrument repair Air Lab Click or tap here to enter text.
Certification of Standards (e.g., gases, flow transfers, MFCs)
Chinook Engineering, Mesa Labs, EPA Region 4
SESD, Air Lab Click or tap here to enter text.
PM filter weighing IML Click or tap here to enter text.
Pb analysis ERG Click or tap here to enter text.
Data verification and processing
Air Lab Click or tap here to enter text.
General office space Air Lab Click or tap here to enter text.
General lab/work space
Air Lab Click or tap here to enter text.
Storage space (short and long term)
Air Lab Click or tap here to enter text.
Air Toxics (Carbonyls, VOCs, PAHs, Metals)
N/A Click or tap here to enter text.
Indicate below any facilities that should be upgraded or any needs for additional physical space
(laboratory, office, storage, monitoring stations, etc.).
Click or tap here to enter text.
c. General Documentation Policies
Complete the following table. If relevant information is provided in a QMP, QAPP, and/or SOP, please provide an appropriate reference in the comment field in place of descriptive language.
Question Yes No Comment
Does the agency have a documented records management plan?
☒ ☐ See Appendix A of QAPP
• If yes, does this include electronic records? ☒ ☐ See Appendix A of QAPP
Does the agency have a list of files considered official records and their media type (i.e., paper and/or electronic)?
☒ ☐ Click or tap here to enter text.
Does the agency have a schedule for retention and disposition of records? Are records kept for at least three years? Comment on how long records are retained.
☒ ☐ See Appendix A of QAPP
Who is responsible for the storage and retrieval of records? If more than one person, please indicate those personnel responsible for storing/retrieving records, including what records each is responsible for.
Larocque: QA, logbooks, field forms, AirVision backup. Talgo: backup for RL, employee records
What security measures are utilized to protect records?
Electronic records (including logbook scans) are backed up on an offsite server; hardcopies are secured in the Air Lab facility
Where/when does the agency rely on electronic files as primary records?
Data files directly from continuous instruments and electronically delivered lab reports
What is the system for storage, retrieval and backup of these files? See Sec. 19.1, 19.2, 19.3 & 19.6 in QAPP
d. Training
d.1 Training Plan
Complete the following table.
Question Yes No Comment
Does the agency have a training plan? If yes, where is it documented?
☒ ☐ Click or tap here to enter text.
If yes, does the training plan include:
• Training requirements by position? ☒ ☐ Click or tap here to enter text.
• Frequency of training? ☐ ☒ Click or tap here to enter text.
• Training for contract personnel? ☐ ☒ Click or tap here to enter text.
• A list of core QA-related courses? Please attach a list of required courses or cite where such information may be found.
☒ ☐
New employee training includes, at minimum: review of quality documents (SOPs, QAPP, Quality Bulletins, etc.), review of applicable local and federal regulations, and on-the-job mentoring with experienced staff. Depending on the tasks they perform, employees may be required to participate in 40-hour HAZWOPER training and Visible Emissions Evaluation training. Employees must familiarize themselves with the CFR requirements, QA procedures, SOPs, and QAPP. Additionally, how-to videos for technical and QA/QC functions are available on the shared drive for the air monitoring staff.
• Does it make use of seminars, courses, EPA-sponsored college level courses, etc.?
☒ ☐
Methods employed in the collection and analysis of environmental samples and environmental data are subject to continual review and improvement. Continuing educational courses offered by vendors, EPA, APTI and Metro 4 are available as funding allows.
Additionally, all employees are actively encouraged to pursue online training opportunities whenever possible. These courses and seminars may be provided as videotapes, closed-circuit transmission, and/or web-based real-time interactive formats.
Are personnel cross-trained for other ambient air monitoring duties?
☒ ☐ Click or tap here to enter text.
Are training funds specifically designated in the annual budget?
☒ ☐ Click or tap here to enter text.
d.2 Training Events
Indicate below the most recent training events, and identify the personnel who participated in them.
Event Date(s) Participant(s)
Distracted Driving 11/15/2018 Tipton, White & Larocque
LEP 11/21/2018 Tipton, White, Larocque & Talgo
HIPAA Awareness 10/1/2018 Tipton, White, Larocque & Talgo
e. Oversight of Contractors and Supplies
e.1 Contractors
Complete the following table. If your agency does not use contract personnel, proceed to section e.2 Supplies.
Contractors Yes No Comment
Who is responsible for oversight of contract personnel? The contract facility (IML & ERG)
Are contractors providing a service (e.g., independent performance audits, PM2.5 lab) audited? How often?
☒ ☐ Analysis of PM 2.5 filters and Lead filter analysis
What steps are taken to ensure contract personnel meet training and experience criteria?
Contract facilities are responsible for their own employee hiring practices. ERG is the national contract lab, and both ERG and IML are subject to EPA audits in their respective regions.
Are contractor Quality Documents reviewed before procuring a service?
☒ ☐
In the past, quality documents were not reviewed. KCDAQM had not procured services (other than national contract services) since the last TSA in July of 2016. When services are placed out to bid, a review of the applicant’s quality documents will occur prior to award.
How often are contracts reviewed and/or renewed? Contracts are renewed annually. Rebidding of contracted services occurs every 5 years.
e.2 Supplies
Complete the following table. If relevant information is provided in a QMP, QAPP, and/or SOP, please provide an appropriate reference in the comment field in place of descriptive language.
Suppliers Yes No Comment
Have specifications been established for consumable supplies and/or equipment? ☒ ☐ Section 17 of QAPP pg. 77
What supplies and equipment have established specifications? Section 17 of QAPP pg. 77
Is equipment from suppliers open for bid? ☐ ☐ Bids must be obtained on items or services >$50,000. Procurements >$25,000 but <$50,000 must obtain 3 written quotes (if not, it must go out to bid). Procurements <$25,000 are at the discretion of the department head.
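The procurement tiers above amount to a simple threshold rule. A minimal sketch, assuming the tier descriptions and function name below (they are illustrative labels, not agency policy text, and the source does not specify how exact boundary amounts like $25,000 are handled):

```python
def procurement_requirement(amount_usd: float) -> str:
    """Illustrative tier lookup for the purchasing thresholds described above.

    Assumption: amounts exactly at a boundary fall into the lower tier,
    since the source text leaves boundary handling unspecified.
    """
    if amount_usd > 50_000:
        return "formal bid required"
    if amount_usd > 25_000:
        return "three written quotes (else formal bid)"
    return "department head discretion"
```

For example, a $30,000 instrument purchase would fall in the three-written-quotes tier, while a $60,000 purchase must go out to bid.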
2. Quality Management
This section of the questionnaire completed by: Amber Talgo
Key Individual(s):
Title/Position Name
Environmental Specialist I / QA Officer Rebecca Larocque
Air Monitoring Program Manager Amber Talgo
a. Status of QA Program
a.1 QA and QC Activities
Complete the following table.
Question Yes No Comment
Does the agency perform all quality assurance (QA) activities with internal personnel (i.e., developing QMPs/QAPPs/SOPs and DQOs/MQOs, performing systems audits, assessments and performance evaluations, corrective actions, validating data, QA reporting, etc.)? If not, please indicate in the comment field who is responsible and which QA activities are performed.
☒ ☐ Click or tap here to enter text.
If the agency has contracts or similar agreements in place with either another agency or contractor to perform audits or calibrations, please name the organization and briefly describe the type of agreement.
TDEC performs semi-annual performance audits. KCDAQM operates under a Certificate of Exemption from the State of TN. No other official contract is in place; TDEC may discontinue these audits at its discretion.
Does the agency perform all quality control (QC) activities with internal personnel (i.e., zero/span/one-point QC checks, calibrations, flowrate, temperature, pressure and humidity checks, certifying/recertifying standards, lab and field blanks, data collection, balance checks, leak checks, etc.)? If not, please indicate in the comment field who is responsible and which QC activities are performed.
☐ ☒
Most standards are recertified by contract laboratories or the manufacturer.
a.2 QC Acceptance Criteria
Complete the following tables.
Question Yes/No Location Comment
Has the agency established and documented criteria to define agency-acceptable QC results?
Yes QAPPs Sec 7.1.3 -7.2
Pollutant | Adheres to the critical QC acceptance criteria for criteria pollutants1 and meteorological measurements2? | QC Acceptance Criteria (if other than validation templates) | Action or Warning Limits | Corrective Action
O3 | Yes | Click or tap here to enter text. | Table 14.3 in QAPP | Table 14.3 in QAPP
PM10 | Yes | Click or tap here to enter text. | Table 14.1 in QAPP | Table 14.1 in QAPP
PM2.5 | Yes | Click or tap here to enter text. | Table 14.2 in QAPP | Table 14.2 in QAPP
Pb | Yes | Click or tap here to enter text. | Table 14.4 in QAPP | Table 14.4 in QAPP
1 Appendix D Validation Templates of the QA Handbook for Air Pollution Measurement Systems Volume II 2 Appendix C Validation Templates of the QA Handbook for Air Pollution Measurement Systems Volume IV
b. Internal PE Audits
b.1 Internal Audit Questions
Complete the following table.
Question Yes No Response
Does the agency maintain a laboratory to support QA activities?
☐ ☒ Click or tap here to enter text.
Has the agency documented and implemented specific audit SOPs separate from monitoring SOPs?
☒ ☐ Click or tap here to enter text.
Are the QA personnel organizationally independent from the personnel responsible for generating environmental data (40 CFR Part 58, Appendix A, § 2.2)? If no, please explain in the comment field.
☒ ☐ Click or tap here to enter text.
Are annual performance evaluation (PE) audits conducted by technician(s) other than the routine site operator(s) (40 CFR Part 58, Appendix A, § 3.1.2)? If no, please explain in the comment field.
☐ ☒ Auditor and site operator are present.
Does the agency have identifiable auditing equipment and standards (specifically intended for sole use) for audits?
☒ ☐ Click or tap here to enter text.
Are audit equipment and standards ever used to support routine calibration and QC checks required for monitoring network operations? If yes, please explain in the comment field.
☐ ☒ Click or tap here to enter text.
b.2 Internal Audit Procedures
If the agency includes performance audit procedures in pollutant-specific monitoring SOPs, please provide an appropriate reference for each pollutant. Otherwise, if the agency does not have a performance audit SOP, please describe the performance audit procedure for each type of pollutant.
Pollutant SOP/Performance Audit Procedure
Choose an item. Click or tap here to enter text.
b.3 Certification of Audit Standards
Attach a list or use the table below to provide information on the certification(s) of audit standards (e.g., flowmeters, gas standards, etc.) currently being used.
Complete the following table.
Question Yes No Comment
Does the agency have a separate certified source of zero air for performance audits?
☐ ☒
KCDAQM doesn't have a "certified" source of zero air. A separate zero air system is used for audits.
Does the agency have procedures for auditing and/or validating performance of meteorological monitoring?
☐ ☐ NA
b.4 Audit Equipment
Use the table provided below to list the agency's audit equipment and age of audit equipment (e.g., flow standards, calibrators, zero air systems, etc.).
Manufacturer Make and Model Number Purchase Year or Year Acquired
BGI HiVol Cal 2013
BGI Tetra Cal Pre-2006
BGI Tri Cal/Tetra Cal Pre-2006
Teledyne T703U 2017
Teledyne (partial year) T703 2009 (approximate)
b.5 Audit Acceptance Criteria
Complete the following tables.
Question Yes/No Location Comment
Has the agency established and documented criteria to define agency acceptable audit results? If yes, comment where (page number, section, etc.)
Yes
Audit SOP: O3: Table 3.2 pg. 12; PM2.5: Table 4.2 pg. 13; PM10: Table 5.2 pg. 18; Pb: Sec 6.2 pg. 21; Speciation: Table 7.2 pg. 26
Click or tap here to enter text.
Pollutant | Adheres to the audit acceptance criteria for criteria pollutants3 and meteorological measurements4? | PE Audit Acceptance Criteria (if other than validation templates) | Do the audit levels (gaseous PE audits only) meet 40 CFR Part 58, Appendix A, § 3.1.2.1 criteria? | Corrective Action
PM10 | No | ±7% | N/A | Click or tap here to enter text.
All others | Yes | Click or tap here to enter text. | Yes | Click or tap here to enter text.
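The ±7% PM10 figure above is a percent-difference acceptance window on flow audit results. A minimal sketch of how such a screen might be applied, assuming the standard percent-difference convention (monitor reading vs. audit standard); the function names and the example flow values are illustrative, not taken from the agency's SOPs:

```python
def percent_difference(audit_flow: float, monitor_flow: float) -> float:
    """Percent difference of the monitor's indicated flow vs. the audit standard."""
    return (monitor_flow - audit_flow) / audit_flow * 100.0

def within_limits(audit_flow: float, monitor_flow: float, limit_pct: float = 7.0) -> bool:
    """True if the audit result falls inside the ±limit_pct acceptance window."""
    return abs(percent_difference(audit_flow, monitor_flow)) <= limit_pct
```

For example, a PM10 sampler indicating 17.2 L/min against a 16.67 L/min audit standard differs by about +3.2% and would pass a ±7% screen.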
c. Planning Documents Including QMP, QAPP, & SOP
c.1 QMP Questions
Complete the following table.
Question Response
Does the agency have an EPA-approved quality management plan (QMP)? Yes
• If yes, what is the approval date of the QMP? 9/20/2018
• If yes, has the QMP been approved by EPA within the last 5 years?
yes
• If yes, is the QMP multi-media or air-specific? Multi-media
• If yes, are changes to the plan needed that have not yet been approved by EPA?
No
3 Appendix D Validation Templates of the QA Handbook for Air Pollution Measurement Systems Volume II 4 Appendix C Validation Templates of the QA Handbook for Air Pollution Measurement Systems Volume IV
c.2 QAPP Questions
Complete the following table.
Question Response
Does the agency have an EPA-approved QA project plan (QAPP)? Yes
• If no, has the agency been delegated self-approval? Choose an item.
How often does the air monitoring agency review QAPPs? Are these reviews documented? If so, please provide a location.
Annually
Does the agency have any QAPP revisions still pending EPA approval? yes
How does the agency verify that the QAPP is fully implemented?
1. Each employee receives an emailed link to a copy of the QAPP with a request for a read receipt; the email instructs the employee to read it.
2. Section 8.3 of QAPP.
3. The quality system is designed with checks and balances, so tasks are checked by someone who did not perform them, etc.
How are staff notified and trained when a QAPP is revised?
Each employee receives an emailed link to a copy of the QAPP with a request for a read receipt; the email instructs the employee to read it. Section 9.2.3 of QAPP; Quality Bulletins
What personnel regularly receive updates? QA Officer, Air Monitoring Manager, staff affected by any changes
Does the agency have any missing QAPPs that need to be developed? no
• If yes, list any missing QAPPs. Click or tap here to enter text.
Provide a list of all QAPPs as an attachment or use the table below. If provided elsewhere, please
provide a reference.
QAPP Title Approval Date Pollutant(s) Status
QUALITY ASSURANCE PROJECT PLAN FOR THE KNOX COUNTY, TN AMBIENT AIR QUALITY MONITORING PROGRAM
11/5/2010 criteria Approved
Quality Assurance Project Plan for Knox County Air Quality Management Ambient Air Monitoring Program
N/A criteria In Review
c.3 SOP Questions
Complete the following tables.
Question Response
Are all standard operating procedures (SOPs) complete, or are some in development?
Some in development
Does the agency have any missing SOPs that need to be developed? Yes
• If yes, list the SOPs that need to be developed. T640
Are SOPs available to all field operations personnel? Yes
Are SOPs for “episodic monitoring” prepared and available to field personnel? Refer to QA Handbook Volume II, Section 6.0.
Choose an item.
Are SOPs based on the framework contained in Guidance for Preparing Standard Operating Procedures (SOPs) (EPA QA/G-6)?
Choose an item.
Does the agency have SOPs specific to data handling and validation? No; one is pending internal review
Who approves SOPs? EPA
How often are SOPs reviewed? Are these reviews documented? If so, please provide a location. How often are SOPs updated?
Annually, no documentation unless an amendment is needed
How are staff notified and trained when a SOP is revised? Receive Email that update is available, how to video if applicable
Provide a list of all SOPs as an attachment or use the table below. If provided elsewhere, please provide
a reference.
SOP Title Approval Date Pollutant(s) Status
Appendix C-F of QAPP, Quality Documents Spreadsheet Submitted
Click or tap to enter a date.
Click or tap here to enter text.
Choose an item.
d. Corrective Action
Complete the following table.
Question Response
Does the agency have an operational, documented, and comprehensive corrective action program in place?
Yes
• As a part of the QAPP? Yes
• As a separate document, or part of a SOP? Yes
Does the agency have established and documented corrective action limits for QA and QC activities?
Yes
Are corrective action procedures based on results of the following that have exceeded established limits?
Tables 14.1-14.4 of QAPP
• 1-Point QC checks Yes
• Calibrations and zero/span checks Yes
• Flow rate verifications Yes
• PEs (gaseous audits and semi-annual flow rate audits) Yes
• Precision goals (collocated PM2.5 and PM10) Yes
• Bias goals Yes
• NPAP audits No, prompt investigation
• PEP audits No, prompt investigation
• Completeness goals Yes
• Data audits Yes
• Technical Systems Audits Yes
How is responsibility for implementing corrective actions assigned? Sec 14.3 in QAPP
How does the agency follow up on implemented corrective actions? Quarterly QA Report, sometimes a quality bulletin if some change is necessary
Briefly describe at least two recent examples of the ways in which the above corrective action system was employed to remove problems.
1.
2.
e. Quality Improvement
Complete the following table.
Question Response
Have all deficiencies indicated in the previous TSA report been corrected? If no, please list and explain.
All except completed QA documents; those have made significant progress, and all have been submitted to EPA at least once.
What actions were taken to improve the quality system since the last TSA?
QA database, Corrective Action Report, usable QAPP and QMP, all updated SOPs
Since the last TSA, do your control charts and/or AQS reports indicate that the overall data quality for each pollutant is steady or improving?
yes
What was/were the cause(s) when goals for measurement uncertainty per 40 CFR Part 58, Appendix A were not met (if applicable)?
Failing pump or MFC
What are your agency’s plans for quality improvement? Continue how-to videos, inventory spare parts
f. External Performance Audits
Complete the following table.
Question Response Comment
Does your agency participate in the following external performance audits? If not, please explain why.
Click or tap here to enter text.
• NPAP Yes Click or tap here to enter text.
• PM2.5-PEP Yes Click or tap here to enter text.
• Pb-PEP Yes Click or tap here to enter text.
• Pb Strip Audit Yes Conducted at ERG; we never see them
• Ambient Air Protocol Gas Verification Program (AA-PGVP)
N/A No Gases
• Round Robin metal PT N/A N/A
• NATTS/PAMS PT N/A Click or tap here to enter text.
List other performance audit participation. Click or tap here to enter text.
Who performs NPAP and PEP audits? EPA Contractors
3. Network Management
This section of the questionnaire completed by: Amber Talgo
Key Individual(s):
Title/Position Name
Air Monitoring Manager Amber Talgo
Environmental Specialist I/ QA Officer Rebecca Larocque
a. Network Design
For monitoring organizations and agencies that do not submit the annual network plan (ANP) required by 40 CFR 58.10, please complete the table below. For those monitoring organizations that do submit an ANP, proceed to section b. Siting.
Site Name AQS Site ID # Pollutant(s) Monitored
Proposed Changes
Click or tap here to enter text.
Click or tap here to enter text.
Click or tap here to enter text.
Click or tap here to enter text.
b. Siting
b.1 Site Evaluations
Complete the following table.
Question Yes No Comment
How often are site evaluations for 40 CFR Part 58, Appendix E criteria conducted?
Frequency: annually
Date of last review: 10/12018
Where is this documented?
ANP, on Shared network drive
Are there any siting issues? ☐ ☒ Click or tap here to enter text.
Does the current level of monitoring effort (station placement, instrumentation, etc.) meet requirements imposed by current grant conditions?
☒ ☐ Click or tap here to enter text.
b.2 Site Non-Conformance
Please list any monitors with siting non-conformances, the AQS Site ID numbers for those monitors, the type of non-conformance and the reason(s) for the non-conformance. If none of your agency's monitors have siting non-conformances, proceed to section c. Waivers.
Monitor | AQS Site ID # | Type of Non-Conformance | Reason(s) for Non-Conformance
Choose an item. Click or tap here to
enter text. Choose an item.
Click or tap here to enter text.
c. Waivers
c.1 Waiver Questions
Complete the following table.
Question Yes No Comment
Does your agency have any waivers? ☐ ☒ Click or tap here to enter text.
Does your agency plan to request any waivers? ☐ ☒ Click or tap here to enter text.
Has your agency obtained necessary waiver provisions to operate equipment which does not meet the effective reference and equivalency requirements (if applicable)?
Click or tap here to enter text.
Do any sites vary from the required operating schedules in 40 CFR 58.12?
☐ ☒ Click or tap here to enter text.
Does the number of collocated monitoring stations meet the requirements of 40 CFR Part 58, Appendix A? If no, which pollutant(s)?
☒ ☐ Click or tap here to enter text.
c.2 Waiver Types
Indicate any waivers requested or granted by the EPA Regional Office, and provide waiver documentation. If your agency does not have any waivers, proceed to section d. Documentation.
Waiver Type Reason
Choose an item. Click or tap here to enter text.
d. Documentation
Complete the following table.
Question Yes No Comment
Are hard copy or electronic site information files retained by the agency for all air monitoring stations within the network? If so, please provide the location of these files in the comment field.
☒ ☐
Site logs are archived into the archive boxes after they are removed from use (annually). Most site logs stay in the field unless there is no suitable location for them (Pb); those are kept in the gear bags associated with the site. Site information is also in the ANP.
Does each station have the required information, including:
• AQS Site ID Number? ☒ ☐ Logbook, Site evaluation form, ANP
• Photographs of the four cardinal compass points?
☒ ☐ Site assessment, ANP
• Startup and shutdown (if applicable) dates? ☒ ☐ ANP
• Documentation of instrumentation? ☒ ☐
Logbooks, QA database (what equipment is where)
Who has custody of the current network documents?
Name: Click or tap here to enter text.
Title: Click or tap here to enter text.
Archived logs are located in the "Computer room" in the Air Lab. Current logs are at the sites themselves or on the shared drive.
4. Field Operations
This section of the questionnaire completed by: Amber Talgo
Key Individual(s) (e.g., Field Manager, Field Supervisor, Field QA Manager, etc.):
Title/Position Name
Environmental Specialist I (Ozone and Cont. PM) Daniel Tipton
Environmental Specialist I (PM2.5 & Pb) Barron White
Environmental Specialist I (QA Officer) Rebecca Larocque
a. Field Support
Complete the following table.
Question Yes No Comment
On average, how often are most of your stations visited by a field operator?
PM2.5: weekly; continuous PM: weekly; Pb: 2×/week; O3: weekly during season
Is this visit frequency consistent for all reporting organizations within your agency (if applicable)?
Unknown
On average, how many stations does a single operator have responsibility for?
5
How many of the stations of your SLAMS/NCORE network are equipped with sampling manifolds?
0
Do the sample inlets and manifolds meet the requirements for through-the-probe audits?
☒ ☐ Click or tap here to enter text.
• Briefly describe the most common manifold type and flow rate.
N/A
• How often are manifolds cleaned? N/A
• What is used to perform the cleaning? N/A
• Are manifolds equipped with a blower? N/A
• Is there sufficient air flow through the manifold at all times?
N/A
• How is the air flow through the manifold monitored? N/A
• Is there a conditioning period for the manifold cleaning?
N/A
• What is the residence time? N/A
• How often is the residence time calculated? N/A
Sampling lines: 1) What material is used for instrument sampling lines?
PTFE
2) How often are sampling lines changed or cleaned? Annually
Do you utilize uninterruptable power supplies or backup power sources at your sites?
☒ ☐
What instruments or devices are protected? Data logger, Analyzer and Calibrator, TEOM and T640x
*Please attach an example of recent documentation of sample residence time calculation.
This is the calculation for the sites as run through 2018. The actual sample line length will go down in 2019.
V_total = V_sample line + V_filter cylinder + V_internal sample line
The sample line is 370 cm long with a 0.3175 cm internal diameter:
V_sample line = π × (0.3175/2)² × 370 = 29.3 cm³
The filter cylinder is 5.334 cm tall with a 4.064 cm diameter:
V_filter cylinder = π × (4.064/2)² × 5.334 = 69.2 cm³
The internal sample line is 247 cm long with a 0.159 cm internal diameter:
V_internal sample line = π × (0.159/2)² × 247 = 4.9 cm³
V_total = 29.3 + 69.2 + 4.9 = 103.4 cm³
Sample Residence Time (in seconds) = V_total / Pump flow (cc/min) × 60 s/min
The ozone internal sample pump flow ranges from 720 cc/min to 880 cc/min:
Residence Time (max) = 103.4 / 720 × 60 = 8.62 s
Residence Time (min) = 103.4 / 880 × 60 = 7.05 s
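The residence time calculation above can be sketched in a few lines. The dimensions and pump flows below are the values reported in the text; the function names are this sketch's own:

```python
import math

def cylinder_volume_cm3(length_cm: float, diameter_cm: float) -> float:
    """Volume of a cylindrical segment in cm³ from length and internal diameter."""
    return math.pi * (diameter_cm / 2) ** 2 * length_cm

# Sample train volumes (dimensions as reported for the 2018 ozone sites)
v_sample_line = cylinder_volume_cm3(370.0, 0.3175)     # ≈ 29.3 cm³
v_filter_cylinder = cylinder_volume_cm3(5.334, 4.064)  # ≈ 69.2 cm³
v_internal_line = cylinder_volume_cm3(247.0, 0.159)    # ≈ 4.9 cm³
v_total = v_sample_line + v_filter_cylinder + v_internal_line  # ≈ 103.4 cm³

def residence_time_s(volume_cm3: float, flow_cc_min: float) -> float:
    """Residence time in seconds: volume divided by flow, converted from minutes."""
    return volume_cm3 / flow_cc_min * 60.0

t_max = residence_time_s(v_total, 720.0)  # slowest pump flow gives the longest time, ≈ 8.6 s
t_min = residence_time_s(v_total, 880.0)  # fastest pump flow gives the shortest time, ≈ 7.0 s
```

Recomputing with a shorter sample line (as planned for 2019) only requires changing the first length argument.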
b. Instrument Acceptance
b.1 Instrumentation
Please list the instruments in your inventory. *Listed only working equipment
Pollutant | Number of Instruments | Make and Model | Reference or Equivalent Number
O3 | 4 | Ozone Analyzer, Teledyne/API 400E | EQOA-0992-087
O3 | 4 | Ozone Calibrator, Teledyne/API 703E | Click or tap here to enter text.
O3 | 1 | Ozone Calibrator, Teledyne/API 703U | Click or tap here to enter text.
PM10 | 1 | PM10 Continuous, TEOM 1405 | EQPM-1090-079
PM2.5 | 1 | PM2.5 Sequential, Thermo Partisol Plus 2025i | RFPS-0498-145
PM2.5 | 5 (2 in pieces) | PM2.5 Sequential, Thermo Partisol Plus 2025 | RFPS-0498-145
Continuous PM2.5 mass | 1 | PM2.5/PM10, T640X | EQPM-0516-238
PM2.5 speciation | 1 | Carbon Sampler, URG 3000N | Click or tap here to enter text.
PM2.5 speciation | 1 | PM2.5 Speciation, Met One Super SASS | Click or tap here to enter text.
Pb | 6 | TSP Hi-Vol, General Metal Works | 40 CFR Part 50, Appendix B
PM10 | 5 | PM10 head inlet for Hi-Vol, Anderson/GMW | RFPS-1287-065
Hi-Vol Orifice | 7 | Hi-Vol Orifice, Anderson/Graseby | N/A
Zero air system/generator | 3 | Gast (pump for zero air) | DOA-P704-AA
b.2 Instrument Needs
Please list your instrument needs in order of priority.
1. PM2.5 FEM or FRM samplers: our current PM2.5 FRMs are aging and failing (five are 13 years old, one is 7 years old, and one is 2 years old).
2. Ozone calibrators (T703U).
c. Calibration
c.1 Calibration Frequency and Methods
Please indicate the frequency and method of multi-point calibrations of gaseous monitors.
Pollutant | Frequency | Calibration Method: Back of Instrument | Calibration Method: Through-the-Probe
O3 | Start, mid, and end of season | ☐ | ☒
c.2 Calibration Questions
Please complete the following table.
Question Yes No Comment
How are field calibration procedures documented, and how are the results recorded?
Click or tap here to enter text.
Are calibrations performed according to the guidance in Volume II of the QA Handbook?
☒ ☐ Click or tap here to enter text.
Are calibration procedures consistent with the operational requirements of Appendices to 40 CFR Part 50 or to analyzer operation/instruction manuals?
☒ ☐ If no, why not? Click or tap here to enter text.
Have changes been made to calibration methods based on manufacturer’s suggestions for a particular instrument?
☐ ☒ If yes, what change(s)? Click or tap here to enter text.
Do standards used for calibrations meet the requirements of appendices to 40 CFR Part 50 (EPA reference methods) and Appendix A to 40 CFR Part 58 (traceability of materials to NIST, SRMs or CRMs)?
☒ ☐ Comment on deviations. Click or tap here to enter text.
Are all flow-measurement devices NIST-traceable?
☒ ☐ Click or tap here to enter text.
d. Certification
d.1 Flow Devices
Please list the authoritative standards used for each type of flow measurement, and indicate the certification frequency of standards to maintain field material/device credibility.
Flow Device | Serial Number | Primary Standard | Certification Frequency | Use (calibration, audit, or spare)
Streamline Pro | M060401 | Yes | Annually | Calibrations/Verifications
Streamline Pro | SM060505 | Yes | Annually | Calibrations/Verifications
Tetra Cal | 453 | Yes | Annually | Audits
Tetra Cal | 187 | Yes | Annually | Audits
Hi-Vol Cal | 96 | Yes | Annually | Audits
Hi-Vol Cal | 95 | Yes | Annually | Calibrations/Verifications
Hi-Vol Orifice | 3614 | Yes | Annually | 4-point calibration (lead)
Streamline Pro | SM060501 | Yes | Annually | Calibrations/Verifications
Defender 510 | 133398 | Yes | Annually | Ozone
d.2 Certification Questions
Please complete the following table.
Question Yes No Comment
How are certifications performed? (internally, by a vendor, or third party?)
Vendor
Where do field operations personnel obtain gas standards? N/A
How are the gas standards verified after receipt? N/A
What equipment is used to perform calibrations (e.g., dilution devices)?
Transfer standard, Streamline Pro
Do the dilution air flow control and measurement devices conform to CFR requirements?
☐ ☐ N/A
What traceability is used? Click or tap here to enter text.
Is calibration equipment maintained at each station? ☐ ☒ Only O3
How is the functional integrity of this equipment documented?
Malfunctions are noted on the calibration form; the lab manager is verbally notified and will remedy the issue if possible
Who has responsibility for maintaining field calibration standards? Air Monitoring Program Manager
*Please have copies of certifications of all standards currently in use from your master and/or satellite
certification logbooks (i.e., chemical, gas, flow, and zero air standards) available for review during the
on-site TSA.
*Please attach an example of recent documentation of traceability.
d.3 Calibrator Certification
Please list the authoritative standards and frequency of each type of dilution, permeation and ozone calibrator, and indicate certification frequency.
Calibrator | Primary Standard | Frequency of Certification/Calibration
O3 Level 2 Standard | SRP SN# 10 | Annually
O3 Level 3 Std. T703U | KCDAQM Level 2 703E SN 187 | Semi-annually
e. Repair
Complete the following table.
Question Yes No Comment
Who is responsible for performing preventive maintenance? Field operators
Is special training provided to those personnel who perform preventive maintenance? Briefly comment on background or courses.
☐ ☒ OJT is provided as well as some How-to Videos
What is the preventive maintenance schedule for each type of field instrumentation? If this information is provided in agency SOPs, please indicate that in the Comment section.
Preventive maintenance is covered in individual SOPs which are App. C-F of KCDAQM QAPP
If preventive maintenance is MINOR, it is performed at: (check one or more)
☒Field Station ☐Headquarters Facilities ☐Manufacturer
Click or tap here to enter text.
If preventive maintenance is MAJOR, it is performed at: (check one or more)
☒Field Station ☒Headquarters Facilities ☒Manufacturer
Click or tap here to enter text.
Does the agency have service contracts or agreements in place with instrument manufacturers? Indicate in the Comment section or attach additional pages to show which instrumentation is covered.
☒ ☐
Teledyne will provide technical support for the life of the instrument, per contracts, on the ozone equipment and the T640 and T640x
Comment briefly on the adequacy and availability of the supply of spare parts, tools, and manuals available to the field operator to perform any necessary maintenance activities. Do you feel that this is adequate to prevent any significant data loss?
☒ ☐
KCDAQM keeps common spare parts on hand. Each employee is issued a set of tools. An inventory and database of spare parts is currently being developed.
Is the agency currently experiencing any recurring problem with equipment or manufacturer(s)? If so, please identify the equipment or manufacturer, and comment on steps taken to remedy the problem.
☒ ☐
The Rule 2025 has frequent exchange errors and requires many make-up days, with an eventual pump change. The Rule tends to be a troublesome monitor.
f. Record Keeping
Complete the following table.
Question Yes No Comment
What type of station logbooks are maintained at each monitoring station? (e.g., maintenance logs, calibration logs, personal logs, etc.)
9.1.1 of QAPP
• If hard-bound logbooks are used, are they electronically scanned on any routine frequency? If yes, at what frequency?
☒ ☐ monthly
What information is included in the station logbooks? 9.1.1 of QAPP
Who reviews and verifies the logbooks for adequacy of station performance? Does the reviewer initial or sign the logbooks to document the review?
QAO, no signature
How is control of logbooks maintained? Sites and monitors with logs are locked. Logs kept in the gear bag remain in the gear bag.
Where is the completed logbook archived? 9.2 QAPP
What other records are used? (Use drop-down menu below). Comment on the use and storage of these documents.
Click or tap here to enter text.
Zero span record Used for data validation, hardcopy archived in the data room boxes
Maintenance log
Maintenance records are kept in the instrument logbooks and are used to track the usability of an instrument, as well as for validation
Log of precision checks
Kept in the instrument log, with a hard copy filed and archived in the data room. Data are uploaded to AQS and used to aid in validation.
Control Charts
Instrument logs, minute data in AirVision, and charts generated during Pb calculations (then printed and filed) are all used in data validation and quality assurance/control
Are calibration records (or calibration constants) available to field operators?
☒ ☐ Click or tap here to enter text.
*Please attach an example field calibration record sheet.
5. Laboratory Operations
This section of the questionnaire completed by: Amber Talgo
Laboratory Name:
N/A
Laboratory Address:
Laboratory Address
Key Individual(s) (e.g., Laboratory Manager, Laboratory Supervisor, Laboratory QA Manager, etc.):
Title/Position Name
Click or tap here to enter text. Click or tap here to enter text.
a. Routine Operation
a.1 Methods
In the table below, identify which of the following analyses are performed in the laboratory, and state the method used to conduct the analyses.
Pollutant Method
Click or tap here to enter text.
Please describe areas where there have been difficulties meeting the regulatory requirements for any of
the above methods.
Click or tap here to enter text.
a.2 Quality System
Complete the following table.
Question Yes No Comment
Are procedures for the methods listed in Section a.1 included in the agency’s QAPP and/or SOPs?
☐ ☐ Click or tap here to enter text.
Have the laboratory SOPs been reviewed and approved by EPA?
☐ ☐ Click or tap here to enter text.
Are SOPs easily and readily accessible for use and reference within the laboratory? If not, where are the documents stored?
☐ ☐ Click or tap here to enter text.
Does the lab have sufficient instrumentation to conduct the analyses?
☐ ☐ Click or tap here to enter text.
Are separate facilities maintained for weighing the different sample types? (e.g., hi-volume vs low-volume), or is one weighing room utilized for all samples? Describe.
☐ ☐ Click or tap here to enter text.
Does your laboratory hold certifications? (EPA, NIST, State, NLAC, or other)
☐ ☐ Click or tap here to enter text.
Does your laboratory operate under a QA Manual or equivalent document?
☐ ☐ Click or tap here to enter text.
Does your laboratory participate in PE programs?
☐ ☐ Click or tap here to enter text.
Does your laboratory have a corrective action process for non-conforming work?
☐ ☐ Click or tap here to enter text.
Does your laboratory have a laboratory staff person assigned the role of QA Officer?
☐ ☐ Click or tap here to enter text.
Please describe needs for laboratory instrumentation.
Click or tap here to enter text.
b. Laboratory QC
b.1 Standards
Please identify the equipment and standards used in support of the gravimetric laboratory, including any quality assurance standards (such as additional weight sets or portable RH/temperature probes).
Device Pollutant Brand (Make) Model (Class) Calibration/Certification
Expiration Date
Choose an item.
Choose an item.
Click or tap here to enter text.
Click or tap here to enter text.
Click or tap to enter a date.
*Please have calibration/certification records for all laboratory standards available for review during
the on-site TSA.
b.2 Laboratory Temperature and RH
Complete the following table.
Question Yes No Comment
What is the accuracy specification and recording time (e.g., 5 min. averaging time) of the temperature sensor (logger) used in the gravimetric laboratory?
Click or tap here to enter text.
What is the accuracy specification and recording time (e.g., 5 min. averaging time) of the relative humidity (RH) sensor (logger) used in the gravimetric laboratory?
Click or tap here to enter text.
What is the accuracy specification for any RH/temperature audit device used in the laboratory, if applicable?
Click or tap here to enter text.
Does the laboratory utilize an infrared (IR) gun to obtain sample shipment temperatures?
☐ ☐ Click or tap here to enter text.
• If yes, is the IR gun NIST-traceable? Provide the certification expiration date.
☐ ☐ Click or tap here to enter text.
• If no, what device is used to obtain shipment temperature? Please describe its traceability and provide a certification expiration date.
Click or tap here to enter text.
c. Laboratory Preventive Maintenance
Complete the following table.
Question Yes No Comment
For laboratory equipment, who has the responsibility for performing preventive maintenance?
Click or tap here to enter text.
If equipment maintenance is performed by laboratory staff, does a SOP detail the procedures to be followed? Provide the SOP title, date, and revision number where the procedures are found.
☐ ☐ Click or tap here to enter text.
Is a maintenance log maintained for the balance? ☐ ☐ Click or tap here to enter text.
Are service contracts in place for the balance? ☐ ☐ Click or tap here to enter text.
If utilizing a weighing room, are service contracts in place for the climate control unit/HVAC?
☐ ☐ Click or tap here to enter text.
Describe static control equipment utilized in the weighing room, if applicable.
Click or tap here to enter text.
Does the weighing room undergo routine cleaning activities? On what frequency?
☐ ☐ Click or tap here to enter text.
Briefly describe the weighing room cleaning regime. Click or tap here to enter text.
d. Laboratory Record Keeping
Complete the following table.
Question Yes No Comment
Are all samples that are received by the laboratory logged in?
☐ ☐ Click or tap here to enter text.
Discuss sample routing (or reference the latest SOP which covers this). Attach a flow chart on the next page, if possible.
Click or tap here to enter text.
For the following four questions, select the medium used to document the various activities listed. If the medium is not listed, select “Other” and list the medium. If the information is not recorded, select “N/A”.
• Environmental conditions, weighing session results, balance checks, and weight checks?
Choose an item.
• Serial numbers of filters prepared for the field? Choose an item.
• Serial numbers of filters returning from the field for analysis?
Choose an item.
• General information about daily lab activities, preventive maintenance procedures, and/or other significant events in the laboratory that may impact data quality or the data record?
Choose an item.
How are data records from the laboratory archived? Click or tap here to enter text.
• Where are these records archived? Click or tap here to enter text.
• Who has this responsibility? (identify person/position)
Click or tap here to enter text.
How long are these records kept? Indicate the number of months/years.
Click or tap here to enter text.
Does the laboratory SOP contain procedures for sample chain-of-custody (COC)?
☐ ☐ Click or tap here to enter text.
• If yes, indicate the title, date, and revision number, and where it can be found.
Click or tap here to enter text.
What type of COC record accompanies the samples? Click or tap here to enter text.
Does the laboratory maintain original COCs or copies?
☐ ☐ Click or tap here to enter text.
Where are COCs filed? Click or tap here to enter text.
*If possible, attach a sample routing flow chart:
e. Laboratory Data Acquisition and Handling
Complete the following table.
Question Yes No Comment
Identify those laboratory instruments (e.g., balances, temperature/RH loggers, etc.) which make use of computer interfaces directly to record data.
Click or tap here to enter text.
Are QC data results readily available to the analyst during a weigh session?
☐ ☐ Click or tap here to enter text.
Do RH/temperature loggers record values using paper chart records (chart wheels)? If yes, where are the paper charts maintained? Are they signed and dated?
☐ ☐ Click or tap here to enter text.
What is the laboratory’s capability with regards to data recovery? In case of problems, can the laboratory recapture data that may be lost in the event of computer failure? Discuss briefly.
Click or tap here to enter text.
Does the laboratory maintain an SOP that discusses how to use the laboratory’s data acquisition instrumentation? If yes, please provide the SOP title, date, and revision number.
☐ ☐ Click or tap here to enter text.
*Please attach a flow chart/diagram which illustrates the transcriptions, verifications, validations, and
reporting processes the data goes through before being released by the laboratory.
f. Filter Questions
Complete the following table.
Question Yes No Comment
Does the agency use filters supplied by EPA? ☒ ☐ Click or tap here to enter text.
• If no, do the filters utilized meet the specifications in 40 CFR Part 50? Who is the vendor? Be prepared to provide documentation to demonstrate acceptance testing results.
☐ ☐ Click or tap here to enter text.
Are unexposed filters equilibrated in a controlled conditioning environment which meets or exceeds the requirements of 40 CFR Part 50? Describe the conditioning room/chamber.
☐ ☐ Click or tap here to enter text.
How long is the conditioning period? Click or tap here to enter text.
Briefly describe how exposed filters are prepared for conditioning.
Click or tap here to enter text.
Briefly describe how and where exposed filters are stored after being weighed.
Click or tap here to enter text.
On what frequency are lab blanks utilized? Click or tap here to enter text.
Are chemical analyses performed on filters? If yes, which? Where are these additional analyses performed?
☐ ☐ Click or tap here to enter text.
g. Metals & Other Analyses
If your laboratory completes lead (Pb) and/or other metals analyses, please complete the tables in this section.
g.1 Laboratory QA/QC
Question Yes No Comment
Are at least one duplicate, one blank, and one standard or spike included with a given analytical batch?
☐ ☐ Click or tap here to enter text.
Briefly describe the laboratory’s use of data derived from blank analyses.
Click or tap here to enter text.
Are criteria established to determine whether blank data are acceptable?
☐ ☐ Click or tap here to enter text.
How frequently and at what concentration ranges does the lab perform duplicate analyses? What constitutes an acceptable agreement?
Click or tap here to enter text.
Please describe how the lab uses data obtained from spiked samples, including the acceptance criteria (e.g., acceptable percent recovery).
Click or tap here to enter text.
Does the laboratory include samples of reference material within an analytical batch? If yes, indicate the frequency, level, and material used.
☐ ☐ Click or tap here to enter text.
Are mid-range standards included in analytical batches? If yes, describe the frequency, level, and compound.
☐ ☐ Click or tap here to enter text.
Are criteria for real-time QC established that are based on the results obtained for the mid-range standards discussed above? If yes, briefly discuss them below or indicate the document in which they can be found.
☐ ☐ Click or tap here to enter text.
Are appropriate acceptance criteria for each type of analysis documented?
☐ ☐ Click or tap here to enter text.
g.2 Chemicals
Question Yes No Comment
Are all chemicals and solutions clearly marked with an indication of shelf life?
☐ ☐ Click or tap here to enter text.
Are chemicals removed and properly disposed of when the shelf life expires?
☐ ☐ Click or tap here to enter text.
Does the laboratory purchase standard solutions, such as those for use with Pb or other metals analyses?
☐ ☐ Click or tap here to enter text.
Are only ACS grade chemicals used by the laboratory?
☐ ☐ Click or tap here to enter text.
Comment on the traceability of chemicals used in the preparation of calibration standards.
Click or tap here to enter text.
g.3 Pb
Question Response Comments
Is Pb analysis performed by a contract laboratory? If yes, provide the laboratory name in the comment section.
Yes ERG
What filter media is used for Pb analysis? Glass fiber
Click or tap here to enter text.
Are filter samples visually inspected for defects (e.g., pinholes, tears and non-uniform deposit)?
Yes Pre-Sampling
Are filters invalidated if defects are found? If no, why not?
Choose an item.
Click or tap here to enter text.
Are tweezers used to handle filters? If yes, what material are the tweezers made of (e.g., Teflon, plastic, metal, etc.)?
Choose an item.
Click or tap here to enter text.
What extraction method is used for filters? Choose an item.
Click or tap here to enter text.
What reagents are used to clean glassware? Click or tap here to enter text.
List standards used for analysis. Click or tap here to enter text.
Are filter lot blanks analyzed for Pb content at a rate of 20 to 30 random filters per batch of 500 or greater? Only for filters not provided by EPA.
Choose an item.
Click or tap here to enter text.
How often are MDLs determined? Click or tap here to enter text.
How many replicates are used for MDLs? Click or tap here to enter text.
Are MDLs calculated in accordance with 40 CFR Part 136, Appendix B? If not, why not?
Choose an item.
Click or tap here to enter text.
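The MDL procedure cited here (40 CFR Part 136, Appendix B) derives the detection limit from replicate spiked analyses as MDL = t(n-1, 0.99) × s, where s is the sample standard deviation of the replicates. A minimal sketch; the replicate values below are illustrative only, not agency data:

```python
import statistics

# One-tailed 99% Student's t values for n - 1 degrees of freedom;
# only a few common replicate counts are tabulated here.
T_99 = {7: 3.143, 8: 2.998, 9: 2.896, 10: 2.821}

def mdl(replicates):
    """MDL = t(n-1, 0.99) * s, per 40 CFR Part 136, Appendix B."""
    t = T_99[len(replicates)]
    return t * statistics.stdev(replicates)

# Hypothetical Pb replicate results (ug/filter) from spiked blanks
reps = [0.021, 0.019, 0.023, 0.020, 0.022, 0.018, 0.021]
print(round(mdl(reps), 4))
```

The minimum of seven replicates named in the question corresponds to the t value of 3.143 (six degrees of freedom).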
Are waste HNO3, HCl, and solutions containing these reagents and/or Pb placed in labeled bottles and delivered to a commercial firm that specializes in removal of hazardous waste?
Choose an item.
Click or tap here to enter text.
6. Data & Data Management
This section of the questionnaire completed by: Amber Talgo
Key Individual(s):
Name | Title/Position
Rebecca Larocque | QAO
Amber Talgo | AMPM
a. Data Handling
Complete the following table.
Question Yes No Comment
Is there a procedure, description, or a chart which shows a complete data sequence from point of acquisition to point of submission of data to EPA?
☒ ☐ Click or tap here to enter text.
Are procedures for data handling (e.g., data reduction, review, etc.) documented? If yes, comment on where.
☒ ☐ Data Handling SOP
In what media (e.g., flash drive, telemetry, wireless, etc.) and formats do data arrive at the data processing location?
PM 2.5: flash drive; O3 and PM cont.: internet; Pb: manually recorded field data and emailed lab report
How often are data received at the processing location from the field sites and laboratory?
O3 and PM cont.: hourly; PM 2.5: weekly; Pb: weekly from the field, monthly from the lab
Are there any activities being done before data is released to agency internal data processing?
☐ ☒ Click or tap here to enter text.
How are data entered into the computer system? (e.g., computerized transcription, manual entry, digitization of strip charts, or other)?
O3 & PM cont.: computerized; PM 2.5: file transfer from monitor to laptop to flash drive, then to the computer system; Pb: manual entry
For manual data, is a double-key entry system used?
☒ ☐ Click or tap here to enter text.
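A double-key entry check amounts to comparing two independent transcriptions of the same record and resolving any mismatch against the original field sheet. A minimal sketch, with hypothetical values:

```python
def double_key_check(entry_a, entry_b):
    """Return the positions where two independent data entries disagree."""
    if len(entry_a) != len(entry_b):
        raise ValueError("entry lengths differ")
    return [i for i, (a, b) in enumerate(zip(entry_a, entry_b)) if a != b]

# Two independent keyings of the same Pb field sheet (hypothetical values)
first_pass  = [0.012, 0.015, 0.011, 0.014]
second_pass = [0.012, 0.051, 0.011, 0.014]  # transposition error at index 1
print(double_key_check(first_pass, second_pass))  # positions to re-verify
```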
*Please provide a data flow diagram indicating the data flow within the reporting organization.
b. Software Documentation
Complete the following table.
Question Yes No Comment
Does your agency use an AQS Manual? If yes, list the title of the manual used including the version number and date published.
☐ ☒ Click or tap here to enter text.
Does your agency use an AirNow Manual? If yes, list the title of the manual used including the version number and date published.
☐ ☒ Click or tap here to enter text.
Does the agency have information on the reporting of precision and accuracy data available?
☒ ☐ Click or tap here to enter text.
What software is used to prepare air monitoring data for release into the AQS and AirNow databases? Include the names of the software packages, vendor or author, revision numbers, and the revision dates of the software.
AirVision by Agilaire, revision 3.6.119, build 2018.06.05.1
What is the recovery capability in the event of a significant computer problem (i.e., how much time and data would be lost)?
This is dependent on which computer(s) are affected.
Has your agency tested the data processing software to ensure its performance of the intended functions is consistent with the QA Handbook Volume II, Section 14.0?
☐ ☒ Click or tap here to enter text.
Does your agency document software tests? If yes, provide the documentation.
☐ ☐ Click or tap here to enter text.
c. Data Validation and Correction
Complete the following table.
Question Yes No Comment
Is there documentation regarding data that have been identified as suspect and subsequently flagged?
☒ ☐ Click or tap here to enter text.
Please describe what action the data validator will take (e.g., flags, invalidate, etc.) if they find data with exceeded QC criteria.
This is very broad and situation dependent.
Please describe how changes made to data that were submitted to AQS and AirNow are documented.
O3 & PM cont.: on the data packet; PM 2.5: on the field tracking sheet; Pb: on the filter envelope
Who has signature authority for approving corrections?
Name: Amber Talgo or Rebecca Larocque; Program Function: AMPM or QAO
What criteria are used to determine whether a data point should be deleted or invalidated?
Data Handling SOP
What criteria are used to determine if data need to be reprocessed?
Weight of evidence
Are corrected data resubmitted to the issuing group/record generator for cross-checking prior to release?
☒ ☐ Pb only
d. Data Processing
d.1 Reports
Complete the following table.
Question Yes No Comment
Does the agency generate data summary reports?
☒ ☐ Click or tap here to enter text.
Please list at least three reports routinely generated, including the information requested below.
Report Title | Distribution | Period Covered
X Quarter Audit Report | AMPM, Director, and QAO | Quarterly
Quarterly Quality Assurance Report | AMPM & ANP | Quarterly
Corrective Action Report | AMPM, APTMD, Director | As needed
d.2 Data Submission
Complete the following table.
Question Yes No Comment
How often are data submitted to AQS? Monthly
How often are data submitted to AirNow? Hourly
Briefly comment on difficulties the agency may have encountered in coding and submitting data following the AQS guidelines.
AQS is inconsistent in its behavior, but we manage to get it done
Does the agency retain a hard copy printout or an electronic copy of submitted data from AQS?
☒ ☐ Click or tap here to enter text.
Are records kept by the agency for at least three years in an orderly, accessible form? If yes, does this include:
☒ ☐ Click or tap here to enter text.
• Raw data ☒ ☐ Click or tap here to enter text.
• Calculations ☒ ☐ Click or tap here to enter text.
• QC data ☒ ☐ Click or tap here to enter text.
• Reports: list which reports are used ☒ ☐ All
Has your agency submitted data (along with the appropriate calibration equations used) to the processing center?
☐ ☒ Click or tap here to enter text.
Are concentrations of PM10 corrected to EPA standard temperature and pressure conditions (i.e., 298 K, 760 mm Hg) before input to AQS?
☒ ☐
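The standard-conditions correction above follows from the ideal gas law: the sampled air volume is rescaled from ambient temperature and pressure to 298 K and 760 mm Hg, which rescales the mass concentration by the inverse volume ratio. A minimal sketch; variable names are ours, not from an agency SOP:

```python
T_STD_K = 298.0      # EPA standard temperature, K
P_STD_MMHG = 760.0   # EPA standard pressure, mm Hg

def pm10_to_stp(conc_actual, temp_k, press_mmhg):
    """Rescale a PM10 mass concentration (ug/m3) measured at ambient
    conditions to EPA standard conditions (298 K, 760 mm Hg).

    The collected mass is fixed; the ideal-gas volume ratio rescales
    the concentration by (P_std / P) * (T / T_std)."""
    return conc_actual * (P_STD_MMHG / press_mmhg) * (temp_k / T_STD_K)

# At 298 K and 760 mm Hg the correction is the identity
print(pm10_to_stp(50.0, 298.0, 760.0))  # 50.0
```

At lower ambient pressure (e.g., higher elevation) the standard-condition concentration is higher than the ambient one, since the same mass occupies a smaller volume at standard conditions.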
Are concentrations of PM2.5 and Pb reported to AQS under actual (volumetric) conditions?
☐ ☒ Click or tap here to enter text.
Are audits on data reduction procedures performed on a routine basis? If yes, at what frequency?
☐ ☒ Click or tap here to enter text.
Are precision and accuracy data checked each time they are calculated, recorded, or transcribed to ensure that incorrect values are not submitted to EPA?
☒ ☐ Click or tap here to enter text.
e. Internal Reporting
e.1 Reports
What internal reports are prepared and submitted as a result of the audits required under 40 CFR Part 58, Appendix A?
Report Title | Frequency
YEAR X Quarter Air Monitoring Audit | Quarterly
What internal reports are prepared and submitted as a result of the precision checks required under 40 CFR Part 58, Appendix A?
Report Title | Frequency
Ozone Monitoring Zero/Precision/Span | Bi-weekly
Question Yes No Comment
Do the audit or precision check reports indicated above include a discussion of corrective actions initiated based on audit or precision check results?
☐ ☒ That would be a separate Corrective Action Form
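The precision checks behind these reports are typically summarized as a percent difference between the monitor's indicated value and the check (audit) value, in the manner of 40 CFR Part 58, Appendix A. A minimal sketch; the 7% acceptance limit shown is illustrative, not the agency's documented criterion:

```python
def percent_difference(measured, audit):
    """Single-check percent difference, d = (measured - audit) / audit * 100."""
    return (measured - audit) / audit * 100.0

def precision_flagged(measured, audit, limit_pct=7.0):
    """Flag a check whose absolute percent difference exceeds the limit.
    The 7% default is illustrative only."""
    return abs(percent_difference(measured, audit)) > limit_pct

# Hypothetical ozone one-point QC check, ppm
print(round(percent_difference(0.072, 0.070), 2))
```

A flagged check would then feed the Corrective Action Form mentioned in the comment above.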
e.2 Responsibilities
Who has the responsibility for the calculation and preparation of data summaries? To whom are such summaries delivered?
Name | Title | Type of Report | Recipient
Rebecca Larocque | Quality Assurance Officer | Quarterly QA | A. Talgo, L. Liddington
Identify the individuals within the agency responsible for reviewing and releasing the data.
Name Program Function
R. Larocque/A. Talgo QAO/AMPM
Question Yes No Comment
Does your agency report to the Air Quality Index (AQI)?
☒ ☐ Click or tap here to enter text.
Is data certification signed by a senior officer of your agency?
☒ ☐ Click or tap here to enter text.