
Preparing Quality Assurance Project Plans

Presented By: Denise L. Goddard, Chemist
Quality Assurance Section
Athens, Georgia

EPA DISCLAIMER

This Presentation is for Training Purposes Only.

EPA QA Documents for Preparing Quality Assurance Project Plans

Guidance on Systematic Planning Using the Data Quality Objectives Process, EPA QA/G-4, EPA/240/B-06/001 (February 2006)

Requirements for Quality Assurance Project Plans, EPA QA/R-5, EPA/240/B-01/003 (March 2001)

Guidance for Quality Assurance Project Plans, EPA QA/G-5, EPA/240/R-02/009 (December 2002)

Are QAPPs Really Required??? YES!!

Quality System Requirements – “Approved Quality Assurance Project Plans (QAPPs) or equivalent documents defined by your organization’s QMP, for all applicable projects and/or studies that will involve environmental data collection or where environmental decisions will be made for a particular site. QAPP must be approved prior to any data gathering work or activities, except under circumstances requiring immediate action (emergency response) to protect human health and the environment or operations conducted under police powers.”

Organizational Applicability??

EPA Organizations – Covered under EPA Order 5360.1 A2: “The Agency-wide Quality System requirements defined by this Order apply to all EPA organizations, and components thereof, in which the environmental programs conducted involve the scope of activities described in Section 5.a above. The authority of this Order applies only to EPA organizations except as addressed by Section 5.d(2) below.”

External Organizations Requirements

Extramural Agreements: Agency-wide Quality System requirements may also apply to non-EPA organizations. These requirements are defined in the applicable regulations governing extramural agreements. Agency-wide Quality System requirements may also be invoked as part of negotiated agreements such as memoranda of understanding (MOUs). Non-EPA organizations that may be subject to quality system requirements include:

Extramural Agreements

(a) Any organization or individual under direct contract to EPA to furnish services or items or perform work (i.e. contractor) under the authority of 48 CFR 46, (including applicable work assignments, delivery orders, and task orders);

40 CFR 31 – Grants & Cooperative Agreements with State & Local Governments

40 CFR 35 – State & Local Assistance

The Purpose of a Quality Assurance Project Plan

As a planning document, the QAPP should contain a detailed description of environmental data collection activities and operations, the problems associated with a site, the sampling and analysis requirements, the decisions to be made, and the necessary QA/QC activities governing this effort.

Issues Addressed by a QAPP

The QAPP must provide sufficient detail on items such as:

The project’s technical and quality objectives – these must be well defined and agreed upon by all affected parties and stakeholders

The program-specific and site-specific requirements (stipulated in consent decrees, records of decision, regulations, statutes, etc.).

The intended measurements, data generation or data acquisition methods that are appropriate for achieving project goals/objectives.

Issues Addressed by a QAPP – Cont'd

A summary of the assessment procedures for confirming that data of the type, quantity and quality required and expected were obtained, and

A description of the process for evaluating the limitations on the use of the information or data obtained that includes identifying, documenting and communicating the limitations to all affected parties and stakeholders.

Overview of Content Requirements

To be effective, the QAPP must clearly state: The purpose of the environmental data operation

(e.g., enforcement, research and development, rulemaking),

The type of work to be done (e.g., pollutant monitoring, site characterization, risk characterization, bench level proof of concept experiments), and

The intended use of the results (e.g., compliance determination, selection of remedial technology, site closure, development of environmental regulations).

Before We Start - Some Preliminaries – Format/Content Requirements

Because the QAPP is a formal document, it should contain:

A Title Page containing the title of the document, the identification of the organization that prepared the QAPP, the preparation date, and the version number. The document should be paginated.

An Approval Page containing signature and date blocks for each of the individuals/organizations responsible for approving this document.

Some Preliminaries – Format and Content Requirements

A Distribution List containing the names, mailing addresses, phone numbers, and email addresses for each of the individuals and organizations requiring copies of the approved QAPP.

A Table of Contents for text, tables, figures, maps & appendices. If there are numerous tables, figures & maps, place these items in the appendix to reduce breakup of the text.

Some Cautionary Tips!!!

Avoid using generic language that does not provide the required information or level of detail.

For projects requiring the generation of chemical or biological data, make sure that you produce a list of contaminants of concern – or identify the biological parameters of interest.

Make sure the approved QAPP is distributed to project personnel and laboratory staff, and if you are using CLP, identify the COCs in the project log (unless there are numerous contaminants).

Let’s Start - Components of a QAPP

A QAPP is composed of 24 elements that are grouped into four classes or categories as follows:

Class A – Project Management
Class B – Measurement/Data Acquisition
Class C – Assessment/Oversight
Class D – Data Validation/Data Usability

Class A Topics - Overview

The elements in this group address:
Project Management
Project History/Site History
Goals & Objectives of the Project
Project Outputs

Class A Topics

The following topics must be addressed as part of the Class A components/elements:
A1 – Title/Approval Page
A2 – Table of Contents
A3 – Distribution List
A4 – Project/Task Organization
A5 – Problem Definition/Background Info
A6 – Project/Task Description
A7 – Quality Objectives & Criteria – DQOs/DQIs
A8 – Special Training/Certifications
A9 – Documents & Records

A4 Project/Task Organization

The following information is required:

Identify the individuals/organizations that will participate in the project/study – discuss their roles/responsibilities – identify the principal data users, decision makers, QA Manager, stakeholders and end data users.

QA Manager – should be identified in the QAPP. This individual should be independent of data collection operations, should have direct access to senior management, and should have overall authority over data collection activities when non-conformance with the QAPP is encountered.

An organizational chart depicting the lines of communication and authority between senior management, the QA Manager and project personnel – it should also include principal and end data users, decision makers, stakeholders, contractors and any subcontractors.

Organizational Chart 1
[Figure: example organizational chart with boxes for Senior Management, Laboratory Analysis, Field Sampling Staff, and Data Validation.]

Organizational Chart 2
[Figure: example organizational chart with boxes for Senior Management, QA Manager, Laboratory Analysis, Field Sampling Staff, Data Validation, and Data Quality Assessment.]

Organizational Chart 3
[Figure: example organizational chart with boxes for Senior Management, QA Manager, Laboratory Analysis, Field Sampling Staff, Data Validation, and Data Quality Assessment.]

Organizational Chart 4
[Figure: example organizational chart with boxes for Senior Management, QA Manager, Project Management, Laboratory Analysis, Field Sampling Staff, and Data Validation & Data Quality Assessment.]

Organizational Chart 5
[Figure: example organizational chart with boxes for Senior Management, QA Manager, Project Manager, Decision Makers, Laboratory Analysis (Organic Analysis and Inorganic Analysis), Field Sampling Staff, and Data Validation & Data Quality Assessment, with example staff names (Joe Smo, Jane Doe, John Wu; DQA: Linda Good).]

A5 Problem Definition/Background

Summarize the problem to be solved, the decision to be made, or the outcome to be achieved.
Include background/historical information.
Include scientific and regulatory perspectives.

A6 Project/Task Description

Summarize all work to be done.
Specify all measurements that must be taken – identify which measurements are critical or non-critical. Critical measurements will be used to make site decisions; non-critical measurements won't be used during the decision-making process.
Provide a list of all of the equipment required.
Identify any products that will be produced.
Provide maps, charts, figures & tables.

A7 Quality Objectives & Criteria

Describe the quality goals/objectives for the project – provide the performance criteria for achieving these goals/objectives, etc.

Provide the project-specific data quality objectives (both qualitative and quantitative) and the specific data quality indicators (precision, bias, sensitivity, comparability, completeness and representativeness) relevant to the project/study.
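Several of these data quality indicators are quantitative and are computed with simple formulas. The sketch below is a minimal Python illustration (the numbers are hypothetical, not from any project) of relative percent difference for precision, percent recovery for bias, and completeness:

```python
def relative_percent_difference(sample, duplicate):
    """Precision between a sample and its field or laboratory duplicate (RPD, %)."""
    return abs(sample - duplicate) / ((sample + duplicate) / 2.0) * 100.0

def percent_recovery(spiked_result, unspiked_result, spike_added):
    """Bias estimate from a matrix spike (%R)."""
    return (spiked_result - unspiked_result) / spike_added * 100.0

def completeness(valid_results, planned_results):
    """Percentage of planned measurements that produced usable data."""
    return 100.0 * valid_results / planned_results

# Hypothetical values for illustration only
print(round(relative_percent_difference(12.0, 10.0), 1))  # 18.2 % RPD
print(round(percent_recovery(48.0, 10.0, 40.0), 1))       # 95.0 % recovery
print(round(completeness(92, 100), 1))                    # 92.0 % complete
```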

Brief Overview of the Systematic Planning Process

Data Quality Objectives Process:
Step 1 – State the Problem
Step 2 – Identify the Goals of the Study
Step 3 – Identify the Information Inputs
Step 4 – Define the Boundaries of the Study
Step 5 – Develop the Analytical Approach
Step 6 – Specify Performance or Acceptance Criteria
Step 7 – Develop the Plan for Obtaining Data

Additional Thoughts on the DQO Process

Include any and all assumptions concerning site contamination, contaminant pathways, remedial techniques, clean-up design, monitoring strategies, etc., as part of the DQO process.

Identify any suspected potential departures from assumptions in support of the DQO process.

A8 Special Training/Certifications

Identify and describe any specialized training (including QA training) needed by project personnel required to successfully complete the project or task.

Discuss how such training will be provided – discuss who is responsible for obtaining internal training for staff.

Discuss where training documentation will be maintained.

Specify whether professional certifications, accreditations or licenses are required for staff to perform their designated tasks/duties.

A9 Documents & Records

Describe the process and responsibilities for ensuring the appropriate project personnel have the most current approved version of the QAPP, including version control, updates, distribution and disposition.

Itemize the information required in project documents, records and reports. The type of information required for analytical data reports must be specified for both hard-copy and electronic formats. Data deliverables can and do include raw data, data from other sources such as computer databases, literature searches, field logs, sample preparation logs, analysis logs, instrument printouts, model inputs and outputs files, and the results of calibrations and QA checks.

A9 Documents & Records

Specify whether status/progress reports and final reports are required.

Specify or reference all applicable requirements for the final disposition of records/documents, including location and retention time.

Identify the individuals who are responsible for preparing project documents, records and reports – also identify who within EPA will receive this information.

Class B Topics - Overview

Discuss all aspects of data collection and generation

Describe sampling design and provide rationale for your approach

Specify the analytical measurements both field and fixed laboratory

Describe sample handling and chain-of-custody requirements

Specify QA/QC samples with acceptance criteria

Class B Topics

B1 – Sampling Process Design B2 – Sampling Methods B3 – Sample Handling & Custody B4 – Analytical Methods B5 – Quality Control B6 – Instrument/Equipment Testing, Inspection & Maintenance B7 – Instrument/Equipment Calibration & Frequency B8 - Inspection/Acceptance of Supplies & Consumables B9 – Non-Direct Measurements B10 – Data Management

B1 Sampling Process Design & Experimental Design

Describe the experimental data generation or data collection design for the project, including as appropriate: The types & numbers of samples required The design of the sampling network The sampling locations, frequency of collection

at each location and sample matrices The measurement parameters of interest, and The rationale for the sampling design chosen.

Sampling Designs Should be Consistent with your Conceptual Models!!

Evaluate your underlying assumptions – whether they are conscious or unconscious.
Use a statistical or sampling tool such as Visual Sample Plan to test your sampling design.
Use historical data, if available, to determine the actual distribution of contaminants.
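Tools such as Visual Sample Plan automate this kind of test by computing how many samples the stated decision error rates require. The sketch below is only a minimal illustration of the underlying arithmetic for a one-sided test of a mean against an action level (it is not VSP's algorithm, and the inputs are hypothetical):

```python
from statistics import NormalDist

def samples_needed(sigma, delta, alpha=0.05, beta=0.20):
    """n >= ((z_(1-alpha) + z_(1-beta)) * sigma / delta)**2 for a one-sided test of the mean."""
    z = NormalDist().inv_cdf(1 - alpha) + NormalDist().inv_cdf(1 - beta)
    return int((z * sigma / delta) ** 2) + 1

# Hypothetical inputs: standard deviation 8 mg/kg, smallest difference that matters 5 mg/kg
print(samples_needed(sigma=8.0, delta=5.0))   # about 16 samples
```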

B1 Sampling Designs

Directed Sampling Designs
Judgmental Sampling

Probability Sampling Designs
Simple Random
Systematic/Grid
Stratified
Composite
Adaptive
Collaborative (Double)
Hot Spot

Judgmental Sampling Design - Pros

Judgmental sampling is the subjective selection of sampling locations in space & time by an individual analyst or expert.

Consistent with intuitive feeling
Easy to direct, easy to do
May be cost effective if the conceptual site model for the project is correct
Great if you know absolutely everything there is to know about the site and your conceptual site model is absolutely correct.

Judgmental Sampling Design - Cons

Inference from sample to population is questionable.
Use of an incorrect conceptual model can lead to incorrect decisions – can be a disaster.
Not suitable for estimating underlying population parameters (e.g., mean) with specified confidence – cannot use statistics to evaluate the distribution of the data with any degree of confidence; with this sampling design there is no underlying assumption that the data are normally distributed.
Not suitable for testing hypotheses about underlying populations with specified decision error rates.

Simple Random Sampling - Pros

Simple in concept and provides proper (theoretically supported) data for statistical data analysis – representative sampling locations are chosen using the theory of random chance probabilities.
Protects against bias in estimating parameters (e.g., means) and testing hypotheses.
Is the basic building block of more complicated (and efficient) sampling designs.

Simple Random Sampling - Cons

Ignores available information that could be used to develop more cost-effective sampling designs
Not as effective as other designs for delineating patterns of contamination or finding hot spots
Difficult to find randomly selected sampling locations
Tends to demand large numbers of samples
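For a simple rectangular study area, randomly selected sampling locations can be generated directly; a minimal sketch (site dimensions, sample count, and seed are hypothetical):

```python
import random

def simple_random_locations(n, x_max, y_max, seed=1):
    """Draw n sampling locations uniformly at random over a rectangular site."""
    rng = random.Random(seed)
    return [(rng.uniform(0, x_max), rng.uniform(0, y_max)) for _ in range(n)]

# Hypothetical 200 m x 100 m site, 15 samples
for x, y in simple_random_locations(15, 200.0, 100.0):
    print(f"sample at ({x:6.1f} m, {y:6.1f} m)")
```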

Systematic (Grid) Sampling

Systematic (grid) sampling consists of collecting samples according to a specified pattern at regular intervals in space or time within a grid pattern:
Square or rectangular grid patterns over space
Equal-interval sampling along a straight line

Systematic Sampling - Pros

Easy to explain and implement and provides uniform coverage of site or project

Good for estimating boundaries, trends, or patterns of contamination over space or time.

May yield more precise estimates of population parameters than other sampling designs

Required for statistical data analysis to estimate trends and spatial patterns

Systematic Sampling - Cons

Systematic sampling can cause estimated means to be biased if the sampling grid pattern lines up with any pattern of contamination.

More information is needed (than for simple random sampling) about the population to estimate the variance of the estimated mean.
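A square grid is straightforward to generate in code, and randomizing the grid origin is one common way to reduce the alignment problem noted above. A minimal sketch with hypothetical site dimensions and spacing:

```python
import random

def grid_locations(x_max, y_max, spacing, seed=2):
    """Square-grid sampling locations with a randomly offset origin."""
    rng = random.Random(seed)
    x0 = rng.uniform(0, spacing)  # random start makes it less likely the grid
    y0 = rng.uniform(0, spacing)  # lines up with a regular contamination pattern
    points = []
    x = x0
    while x <= x_max:
        y = y0
        while y <= y_max:
            points.append((x, y))
            y += spacing
        x += spacing
    return points

# Hypothetical 200 m x 100 m site sampled on a 25 m grid
print(len(grid_locations(200.0, 100.0, 25.0)), "grid sampling locations")
```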

Stratified Sampling

The target population is divided meaningfully into contiguous sub-populations called strata

Sampling locations are selected independently within each stratum using some sampling design

Stratified Sampling - Pros

Dramatically reduces the variability present in the population and hence improves precision

Enables estimates of individual areas to be made

Assists in providing good coverage of the project

Allows for increased samples from policy or project sensitive areas

Stratified Sampling - Cons

Requires advanced knowledge in order to divide the study area into roughly homogeneous strata before sampling

The number of samples to be taken in each stratum must be determined

If strata boundaries are inaccurate, what appear to be outliers can arise simply because samples fall in the wrong stratum
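After the strata are defined, the planned samples still have to be divided among them; allocation proportional to stratum area is one simple rule. A minimal sketch with hypothetical strata:

```python
def proportional_allocation(total_samples, stratum_areas):
    """Split a fixed number of samples across strata in proportion to area."""
    total_area = sum(stratum_areas.values())
    return {name: max(1, round(total_samples * area / total_area))
            for name, area in stratum_areas.items()}

# Hypothetical strata (areas in square metres)
areas = {"former_lagoon": 4000, "drum_storage": 1000, "background_field": 15000}
print(proportional_allocation(40, areas))
# {'former_lagoon': 8, 'drum_storage': 2, 'background_field': 30}
```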

Composite Sampling

Many individual (grab) samples are combined and thoroughly mixed to make a homogeneous whole.

At random, sub-samples (composite samples) are made and sent to the laboratory for analysis.

The physical size of a composite sample is the same as that of samples obtained at random.

Composite Sampling - Pros

Allows for estimating the mean concentration with the same precision at a lower cost

Provides better coverage of the study site without increasing the number of chemical analyses

Allows for a more representative sample from a basic area of sample support (sampling unit).

Can be used in combination with other sampling designs.

Composite Sampling - Cons

Information on individual samples used to form composite samples is lost in compositing

Potential for loss of contaminants (volatiles) during the mixing and handling phase

Potential for reactions and interactions among analytes during compositing

Need to decide how many grab samples to composite and how many composite samples to send for analysis
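The cost saving comes from averaging grabs before analysis: with equal-volume grabs and ideal mixing, the mean of the composite results still estimates the overall mean, but with far fewer laboratory analyses. A minimal sketch using simulated (hypothetical) grab concentrations:

```python
import random
import statistics

rng = random.Random(3)
# Hypothetical grab-sample concentrations (mg/kg) across the study area
grabs = [abs(rng.gauss(20, 8)) for _ in range(24)]

def composite(values, grabs_per_composite):
    """Average equal-volume grabs into composites (ideal mixing assumed)."""
    k = grabs_per_composite
    return [statistics.mean(values[i:i + k]) for i in range(0, len(values), k)]

composites = composite(grabs, grabs_per_composite=6)  # 24 grabs -> 4 lab analyses
print("mean of all 24 grabs: ", round(statistics.mean(grabs), 1))
print("mean of 4 composites: ", round(statistics.mean(composites), 1))
```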

WHY IS YOUR SAMPLING DESIGN IMPORTANT!!

UNCERTAINTY!!! UNCERTAINTY!!! UNCERTAINTY!!!

Due to the Variability Between Analytical Results Within a Given Data Set???

OR

Due to Sampling Issues???

And The Correct Answer Is

BOTH!!!

ANYTHING ELSE??? You bet. In addition to how you collected the samples, there is the important issue of WHERE you collected your samples. This relates back to your sampling design and the assumptions you made concerning site conditions, which in turn directed the development of your conceptual site model. These issues could have greatly increased your uncertainty and may lead to a wrong decision. A wrong sampling design and a flawed conceptual site model will lead to DECISION ERROR.

Bottom Line!!

‘It is understandable that analytical studies, with their sophisticated instrumentation and high cost, are often perceived as the dominant element in a site characterization project/study. Yet, despite that sophistication and high cost, analytical data generated under a scientifically defective or unsound sampling design will have limited utility.’

The Best Result

Data Set Distribution – Normality

Normal Distributions & the Central Limit Theorem

The normal distribution is one which appears in a variety of statistical applications. One reason for this is the central limit theorem. This theorem tells us that sums of random variables are approximately normally distributed if the number of observations is large. For example, if we toss a coin many times, the distribution of the total number of heads approaches a normal distribution. Even when a distribution may not be exactly normal, it may still be convenient to assume that a normal distribution is a good approximation. In this case, many statistical procedures, such as the t-test, can still be used.
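The coin-toss example is easy to simulate: the total number of heads is a sum of 0/1 random variables, and repeating the experiment many times produces a bell-shaped histogram of totals. A minimal sketch (toss and trial counts chosen only for illustration):

```python
import random
from collections import Counter

rng = random.Random(4)

def heads_in(n_tosses):
    """Total heads in n_tosses fair coin flips (a sum of 0/1 random variables)."""
    return sum(rng.random() < 0.5 for _ in range(n_tosses))

# Repeat the 100-toss experiment 5000 times and sketch the shape of the totals
totals = Counter(heads_in(100) for _ in range(5000))
for heads in range(35, 66, 5):
    print(f"{heads:3d} heads | {'#' * (totals[heads] // 10)}")
```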

Ranked Set Sampling – A Combination of Statistics & Expert Judgment

A sampling design where expert judgment is used in combination with simple random sampling.

Simple random sampling is used to create a large number of potential samples. The expert then ranks these potential samples and selects which to send for analysis.

Ranked Set Sampling – Pros & Cons

Pros:
Better representativeness through using experts
Better precision than random sampling
Same simple formulae to use

Cons:
Increased cost of the expert ranking samples
Difficult to quantify the exact improvement
Need to find the best variable to do the ranking on
…but the pros definitely outweigh the cons
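In the classic balanced version of ranked set sampling, field crews draw several small random sets, an expert ranks each set using a cheap auxiliary indicator (visual staining, field screening readings, etc.), and only one ranked unit per set goes to the laboratory. A minimal sketch, with the ranking variable and concentrations simulated purely for illustration:

```python
import random
import statistics

rng = random.Random(5)

# Hypothetical candidate locations: (cheap field-screening value, lab concentration)
candidates = []
for _ in range(90):
    true_conc = abs(rng.gauss(50, 15))
    screening = true_conc + rng.gauss(0, 10)   # cheap indicator roughly tracks concentration
    candidates.append((screening, true_conc))

def ranked_set_sample(units, set_size=3, cycles=3):
    """Balanced ranked set sampling: in each cycle draw set_size small random sets
    and keep the i-th ranked unit from the i-th set, so every rank is used once."""
    chosen = []
    for _ in range(cycles):
        for i in range(set_size):
            s = rng.sample(units, set_size)   # small simple random set
            s.sort(key=lambda u: u[0])        # expert ranks by the cheap indicator
            chosen.append(s[i])               # only this unit is sent to the lab
    return chosen

lab_samples = ranked_set_sample(candidates)
print(len(lab_samples), "lab analyses; estimated mean =",
      round(statistics.mean(c for _, c in lab_samples), 1))
```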

B2 Sampling Methods

Describe the procedures for collecting samples – provide SOPs
Specify sampling methods and equipment
Provide sample container, volume, preservation, and holding time requirements
Describe the decontamination procedures
Provide a list of sampling equipment
Describe performance requirements for sampling methods
Identify the location of support facilities
Identify the individuals who are responsible for implementing corrective actions during field sampling activities

B3 Sample Handling & Custody

Describe the requirements for sample handling & custody in the field, laboratory, and during transport, taking into account your holding time requirements.

Include sample handling requirements for packaging, transporting and storing the collected samples.

Provide examples of sample labels, custody forms, sample custody logs and custody seal.

B4 Analytical Methods

Identify the analytical methods, instruments, equipment required.

Discuss how laboratory staff are to sub-sample the collected environmental sample.

Identify the contaminants of concern and specify the extraction, digestion and analytical method for each contaminant

Specify the laboratory decontamination and waste disposal procedures

Identify the individuals who are responsible for implementing corrective actions when problems are encountered during extraction, digestion or analysis of the samples.

Specify the detection limit requirements for each contaminant.
Provide the regulatory standard(s) (action limits, ARARs, MCLs, water quality standards, etc.).
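Listing each contaminant of concern with its required detection limit and action level makes screening of reported results mechanical; a minimal sketch (the contaminants, limits, and results below are hypothetical placeholders):

```python
# Hypothetical screening table: contaminant -> (required detection limit, action level) in mg/L
LIMITS = {
    "benzene": (0.001, 0.005),
    "lead":    (0.002, 0.015),
}

def screen(contaminant, result, detection_limit):
    """Compare one reported result to the QAPP detection limit requirement and action level."""
    required_dl, action_level = LIMITS[contaminant]
    if detection_limit > required_dl:
        return "reported detection limit does not meet the QAPP requirement"
    if result is None:                        # lab reported a non-detect
        return "not detected at or above the detection limit"
    return "exceeds action level" if result > action_level else "below action level"

print("benzene:", screen("benzene", 0.007, 0.0005))  # exceeds action level
print("lead:   ", screen("lead", None, 0.001))       # not detected
```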

B5 Quality Control

Identify QC activities needed for each sampling, analysis, or measurement technique. For each required QC activity, list the associated method or procedure, acceptance criteria, and corrective action.

B5 Quality Control Samples

Specify the type and frequency of quality control sample collection or QC activity:
Blanks
Spikes (MS/MSDs)
Duplicates
Standard Reference Materials
Rinsates/Equipment Blanks
Second Column Confirmation

B5 Quality Control Samples

Specify the acceptance criteria for spike recoveries and the precision requirements.

Specify the frequency of QC sample collection and analysis.
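Once acceptance criteria are written into the QAPP, each QC result can be checked against them automatically and flagged for corrective action when it fails; a minimal sketch with hypothetical criteria:

```python
# Hypothetical acceptance criteria from a QAPP: matrix spike recovery 75-125 %R,
# field/lab duplicate precision no worse than 20 % RPD
RECOVERY_LIMITS = (75.0, 125.0)
MAX_RPD = 20.0

def check_spike(recovery_pct):
    low, high = RECOVERY_LIMITS
    return "acceptable" if low <= recovery_pct <= high else "fails - corrective action required"

def check_duplicate(rpd_pct):
    return "acceptable" if rpd_pct <= MAX_RPD else "fails - corrective action required"

print("MS recovery 62 %R:", check_spike(62.0))     # fails
print("duplicate RPD 8 %:", check_duplicate(8.0))  # acceptable
```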

B6 Testing, Inspection & Maintenance

Identify the instruments/equipment requiring testing, inspection & maintenance during data collection operations (both field and fixed laboratory).

Provide the testing, inspection and maintenance procedures & identify the individuals who are responsible for these tasks.

Specify the frequency of instrument & equipment testing, inspection & maintenance.

Discuss the corrective actions necessary when instruments & equipment no longer function as required.

Identify the location of spare parts for repairing items.

B7 Calibration & Frequency

Identify all tools, gauges, instruments, and other sampling, measuring, and test equipment used for data generation or data collection activities affecting quality. These items must be controlled and, at specified intervals, calibrated to maintain performance within specified limits.

B7 Calibration & Frequency

Identify the instruments/equipment requiring calibration.

Describe the calibration procedures and identify the standards used during calibration.

Specify the frequency of calibration and specify the acceptance criteria for calibrations (for all instruments/equipment).

Identify the individuals who are responsible for performing instrument/equipment calibrations.

B8 Supplies & Consumables

Identify the supplies & consumables that are used during field data collection operations

Supplies & consumables would include calibration solutions/standards, calibration gases, reagents, tubing & hoses, de-ionized water, potable water, electronic storage media (data loggers), etc.

Specify the acceptance and rejection criteria for each item.

Identify the individuals who will inspect supplies & consumables to ensure that they meet the relevant acceptance criteria.

B9 Non-Direct Measurements

Identify any types of data needed for project implementation or decision making that are obtained from non-measurement sources such as computer data bases, programs, literature searches, surveying data and historical data/data-bases, modeling, etc.

Describe the intended use of these data, define the acceptance criteria for their use in the project, and specify any limitations and restrictions on the use of the data.

B10 Data Management

Describe the project data management process, tracing the path of the data from their generation to their final use or storage (e.g., the field, the office and/or the laboratory).

Describe or reference the standard record-keeping procedures, document control system, and the approach used for data storage and retrieval on electronic media.

Discuss the control mechanism for detecting and correcting errors and for preventing loss of data during data reduction, data reporting, and data entry to forms, reports and databases. Provide examples of any forms or checklists to be used.
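Error-detection controls during data entry are often implemented as simple validation rules applied to every record; a minimal sketch of required-field and range checks (field names and limits are hypothetical):

```python
# Hypothetical validation rules for one field-data record
REQUIRED_FIELDS = ("sample_id", "collection_date", "matrix", "result", "units")
RANGE_CHECKS = {"ph": (0.0, 14.0), "temperature_c": (-5.0, 50.0)}

def validate_record(record):
    """Return a list of data-entry problems found in one record (empty list = clean)."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS
                if record.get(f) in (None, "")]
    for field, (low, high) in RANGE_CHECKS.items():
        value = record.get(field)
        if value is not None and not low <= value <= high:
            problems.append(f"{field}={value} outside plausible range {low} to {high}")
    return problems

rec = {"sample_id": "SW-01", "collection_date": "2006-05-01", "matrix": "surface water",
       "result": 3.2, "units": "mg/L", "ph": 17.2}
print(validate_record(rec))   # flags the implausible pH value
```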

B10 Data Management

Identify and describe all data handling equipment and procedures used to process, compile, and analyze data, including:
Computer hardware
Computer software
Software configurations
Include secondary data sources.

B10 Data Management

Describe the procedures that will be followed to demonstrate acceptability of the hardware and software configuration required, and describe the process for assuring that applicable information resource management requirements are satisfied.

Discuss how your organization will comply with EPA data management requirements as specified in EPA Order 2180.1 or newly issued data standards.

Class C Topics - Overview

The topics in this group address the activities for assessing the effectiveness of project implementation and associated QA/QC activities. The purpose of assessment is to ensure that the QAPP is implemented as prescribed.

Class C Topics

C1 – Assessment & Response Actions
C2 – Reports to Management

C1 Assessments & Response Actions

Describe each assessment to be used in the project including the frequency and type.

Assessments include, but are not limited to, surveillance, management systems reviews, readiness reviews, technical systems audits, performance evaluations, audits of data quality and data quality assessments.

Discuss the information expected and the success criteria (i.e., goals, performance objectives, acceptance criteria specifications, etc.).

C1 Assessments & Response Actions

List the approximate schedule of assessment activities.
For any planned self-assessments (utilizing personnel from within the project groups), identify potential participants and their exact relationship within the project organization.

For independent assessments, identify the organization and the person(s) that shall perform the assessments if this information is available.

Describe how and to whom the results of each assessment shall be reported.

Discuss how corrective actions will be implemented, documented, tracked and verified for closure.

C2 Reports to Management

Identify the frequency and distribution of reports issued to inform management (EPA or otherwise) of the project status, or to inform them of the results of performance evaluations and systems audits, data quality assessments, and significant data quality issues.

Identify the preparer and the recipients of the reports and any specific actions recipients are expected to take as a result of the reports.

Class D Topics - Overview

The topics in this group address the QA activities that occur after the data collection phase of the project is completed. Implementation of these elements determines whether or not the data conform to the specified criteria, thus satisfying the project objectives.

Class D Topics

D1 – Data Review, Verification & Validation
D2 – Verification & Validation Methods
D3 – Reconciliation with User Requirements

D1 Data Review, Verification & Validation

Specify the criteria used to review and validate the data – that is provide the acceptance and rejection criteria by which the data will be assessed to determine the quality of this information.

Provide a list of the data qualifier flags or qualifiers along with their respective definitions.
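Qualifier flags are typically assigned by applying the stated acceptance and rejection criteria to each result. The sketch below uses the common U/J/R convention purely as an illustration of the idea; the thresholds are hypothetical and not taken from EPA functional guidelines:

```python
def qualify(result, detection_limit, blank_contaminated, recovery_pct):
    """Assign an illustrative data qualifier flag from simple review criteria."""
    if recovery_pct is not None and recovery_pct < 10:
        return "R"   # rejected: result unusable
    if result is None or result < detection_limit:
        return "U"   # not detected at or above the reported detection limit
    if blank_contaminated or (recovery_pct is not None
                              and not 75 <= recovery_pct <= 125):
        return "J"   # estimated: usable, but quantitation is uncertain
    return ""        # no qualifier: result accepted as reported

print(qualify(0.8, 1.0, False, 95.0))   # U
print(qualify(5.2, 1.0, True, 95.0))    # J
print(qualify(5.2, 1.0, False, 4.0))    # R
```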

D2 Verification & Validation Methods

Describe the process to be used for verifying and validating the data, including the chain-of-custody for data throughout the life of the project or task.

Discuss how issues shall be resolved and the authorities for resolving such issues within the organization.

Describe how the results of data verification & validation are conveyed to end data users, decision makers and stakeholders.

Precisely define and interpret how validation issues differ from verification issues for this project.

Provide examples of any forms or checklists to be used; and, identify any project-specific calculation required.

D3 Reconciliation with User Requirements

Describe how the results obtained from the project or task will be reconciled with the requirements defined by the data user or decision maker.

Outline the proposed methods to evaluate the data and determine those possible anomalies or departures from assumptions that were established in the planning phase of data collection.

Describe how reconciliation with user requirements will be documented, issues will be resolved, and how limitations on the use of the data will be documented, communicated and reported to decision makers and stakeholders.

Reference Page & Appendices

Reference Page: Contains a list of the references cited in the QAPP.

Appendices: Contains any relevant materials and documents that will support the QAPP.

QAS Contacts

Marilyn Maycock, Chief (706) 355-8553

Denise Goddard, Chemist (706) 355-8568

QAS Contacts

Charlie Appleby, Chemist (706) 355-8555

Ray Terhune, Chemist (706) 355-8557