Cancer Registry Data Quality and Analysis


DESCRIPTION

Cancer Registry Data Quality and Analysis - Hai-Rim Shin - Data Analysis and Interpretation Group, International Agency for Research on Cancer, Lyon, France

TRANSCRIPT

Cancer Registry Data Quality and Analysis
Hai-Rim Shin
Data Analysis and Interpretation Group, International Agency for Research on Cancer, Lyon, France

9 November 2009 Indonesia

http://www-dep.iarc.fr/

Coding and classification system: ICD-O-3

DEPedits and IARCcrgTools: CHECK program

Multiple Primary Rules

IARC (DEP) processing of ICD-O-3 data received from cancer registries:

1. Preliminary conversion into ICD-O-3 (errors (1) returned to the registry: yes/no)
2. Edit checks (DEPedits) (errors (1+2) returned to the registry: yes/no)
3. Multiple primary check
4. Conversion from ICD-O-3 to ICD-10
5. Check mortality and population files
6. Update process form, MS Access and Excel sheet
7. Convert the incidence, mortality and population raw data files into a common file format
8. Produce editorial tables / update Excel sheet (if the tables are OK, send to the editors via the FTP site; if information is missing, query the registry)

Data cleaning team: Mathieu Mazuir, Eric Masuyer, Mohssen Issa
Data processing team: Morten Ervik, Jacques Ferlay, Mary Heanue

Quality: quality control

Quality of data: the registry data should be reliable and of good quality, i.e. complete, consistent and accurate.

Quality control: the mechanism by which the quality of the data can be assessed, either
* a formal ongoing programme, or
* an ad hoc survey
to assess the completeness and consistency of case finding, abstracting and coding, as well as the accuracy of reporting.

References
Skeet RG. Quality and quality control. In: Jensen OM, Parkin DM, MacLennan R, Muir CS, Skeet RG, editors. Cancer registration: principles and methods (IARC Scientific Publications No. 95). Lyon: IARC; 1991. p. 101-7.
Parkin DM, Chen VW, Ferlay J, Galceran J, Storm HH, Whelan SL, editors. Comparability and quality control in cancer registration (IARC Technical Report No. 19). Lyon: IARC (WHO) and IACR; 1994.

Quality of information:
1. Completeness of cover
2. Completeness of detail
3. Accuracy of detail
4. Accuracy of reporting
5. Accuracy of interpretation

Quality of information:
1. Completeness of cover: every cancer case recorded, with no duplicates
2. Completeness of detail: essential items (e.g. diagnosis, sex) recorded; non-essential items may be not recorded, not applicable, or not known
3. Accuracy of detail: errors of detail arise in abstraction, transcription and coding
4. Accuracy of reporting
5. Accuracy of interpretation: to interpret the information properly, it is essential to understand the data sources and how the data are collected and processed

Quality control:
1. Assessment of completeness (objective measures of completeness; completeness and accuracy of detail)
2. Continuous or ad hoc quality control
3. Computer checks for data quality

Assessment of completeness: the population-based registry aims to record all cancer cases occurring within its defined geographical area. Completeness should be constantly monitored, rather than occasionally measured:

(a) by monitoring the proportions of death-certificate cases (DCN versus DCO)
(b) by monitoring the incidence of each site annually
(c) by monitoring differences in incidence rates between subdivisions of the registry area
(d) by sampling patient attendance at a specialized clinic

Completeness and accuracy of detail: all incoming reports or registry abstracts should be checked rapidly upon arrival to ensure that at least all the essential items of information are complete.

(a) by blind re-abstraction and recoding, without reference to the original registration
(b) in computerized registries, data quality can be checked using automated routines:
- validation checks: coding control files
- consistency checks: e.g. cervix uteri in males or prostate in females; event dates preceding the date of birth; site-specific morphology terms
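The automated consistency checks above can be sketched as follows. This is a minimal illustration, not the actual IARC/IACR CHECK program; the record field names and the two rules shown are assumptions for the example.

```python
from datetime import date

# Impossible sex/site combinations (illustrative subset, by 3-character
# ICD-O-3 topography): cervix uteri in males, prostate in females.
SEX_SITE_CONFLICTS = {
    "M": {"C53"},
    "F": {"C61"},
}

def consistency_errors(rec):
    """Return a list of consistency errors found in one registry record."""
    errors = []
    topo3 = rec["topography"][:3]
    if topo3 in SEX_SITE_CONFLICTS.get(rec["sex"], set()):
        errors.append(f"site {rec['topography']} impossible for sex {rec['sex']}")
    if rec["incidence_date"] < rec["birth_date"]:
        errors.append("incidence date precedes date of birth")
    return errors

record = {"sex": "M", "topography": "C53.9",
          "birth_date": date(1950, 3, 1), "incidence_date": date(1940, 1, 1)}
print(consistency_errors(record))  # flags both rules for this record
```

A real edit program would apply many more rules (e.g. site-specific morphology terms) from coding control files rather than hard-coded tables.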

Pre-requisites for quality control

Rules and documentation: in the form of a procedural manual

Good coding systems: only one code is allocated for each appropriate term

Standards: maximum tolerable error rates, e.g. 5% at the three-digit level of ICD-O and 0.5% for sex

Evaluation of data quality in the cancer registry:
- Completeness
- Comparability
- Validity or accuracy
- Timeliness

Review papers
Bray F, Parkin DM. Evaluation of data quality in the cancer registry: principles and methods. Part I: comparability, validity and timeliness. Eur J Cancer 2008.
Parkin DM, Bray F. Evaluation of data quality in the cancer registry: principles and methods. Part II: completeness. Eur J Cancer 2008.

Comparability depends on:
- the system used for classification and coding of neoplasms;
- the definition of incidence, i.e. what is defined as a case, and what is the definition of the incidence date;
- the distinction between a primary cancer (new case) and an extension, recurrence or metastasis of an existing one;
- the recording of cancers detected in asymptomatic individuals.

Comparability:
1. International standards for classification and coding of neoplasms (International Classification of Diseases for Oncology, ICD-O)
2. Incidence date
3. Multiple primaries
4. Incidental diagnosis

International standards for classification and coding of neoplasms
1. ICD-O-3 (2000, WHO)
- Topography: location of the tumour in the body (T code, e.g. C16)
- Morphology: microscopic appearance and cellular origin of the tumour (M code, e.g. 8000)
- Behaviour: whether the tumour is malignant, benign, in situ or of uncertain behaviour (e.g. /3)
- Grade: the extent of differentiation of the tumour
A standard coding scheme is also provided for recording the basis of diagnosis of cancers.
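As a small illustration of the ICD-O-3 structure above, the sketch below splits a topography code and a morphology/behaviour code into their parts. The function name and the returned field names are ours, not part of ICD-O.

```python
def parse_icdo3(topography, morphology):
    """Split ICD-O-3 codes, e.g. topography 'C16.9' and morphology '8000/3'."""
    site, _, subsite = topography.partition(".")
    histology, _, behaviour = morphology.partition("/")
    return {
        "site": site,                 # e.g. C16 = stomach
        "subsite": subsite or None,   # 4th digit, e.g. .9 = NOS
        "histology": int(histology),  # e.g. 8000 = neoplasm, NOS
        "behaviour": int(behaviour),  # e.g. /3 = malignant
    }

print(parse_icdo3("C16.9", "8000/3"))
```

Grade and basis of diagnosis are separate one-digit fields in the full ICD-O record and are not modelled here.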

Date of diagnosis (incidence date)
SEER Program Coding and Staging Manual 2007 (pp. 61-64); ENCR, 1999: standards recommended for the definition of the incidence date, in order of priority:
1. Date of first histological or cytological confirmation
2. Date of admission to the hospital
3. Date of first evaluation (outpatient clinic)
4. Date of diagnosis other than 1, 2 or 3
5. Date of death, if no other information is available
6. Date of death, at autopsy
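The priority rule above can be sketched as a simple lookup: take the first source, in priority order, for which a date is available. The dictionary keys are illustrative names for the six sources, not a standard schema.

```python
# Priority order for the incidence date (illustrative field names).
PRIORITY = [
    "histology_or_cytology",        # 1. first histological/cytological confirmation
    "hospital_admission",           # 2. admission to hospital
    "first_outpatient_evaluation",  # 3. first evaluation (outpatient clinic)
    "other_diagnosis",              # 4. diagnosis other than 1-3
    "death_no_information",         # 5. death, no other information
    "death_autopsy",                # 6. death, at autopsy
]

def incidence_date(dates):
    """Return (source, date) for the highest-priority available date."""
    for source in PRIORITY:
        if dates.get(source):
            return source, dates[source]
    return None, None

dates = {"hospital_admission": "2009-10-02",
         "histology_or_cytology": "2009-10-15"}
print(incidence_date(dates))  # the histological date wins despite being later
```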

Multiple primaries IARC Multiple Primary Rule (2000 and 2004)

International rules for multiple primary cancers (ICD-O third edition), 2004. IARC Internal Report No. 2004/02.

IARC/IACR multiple primary rules: groups of topography codes considered a single site

2000 rules:
- C01, C02: base of tongue; other and unspecified parts of tongue
- C05, C06: palate; other and unspecified parts of mouth
- C07, C08: parotid gland; other and unspecified major salivary glands
- C09, C10: tonsil; oropharynx
- C12, C13: pyriform sinus; hypopharynx
- C19, C20: rectosigmoid junction; rectum
- C23, C24: gallbladder; other and unspecified parts of biliary tract
- C30, C31: nasal cavity and middle ear; accessory sinus
- C33, C34: trachea; bronchus and lung
- C37, C38.0-3, C38.8: thymus; heart and mediastinum; overlapping lesion of heart, mediastinum and pleura
- C40, C41: bones, joints and articular cartilage of limbs; bones, joints and articular cartilage of other sites
- C51, C52, C57.7, C57.8-9: vulva; vagina; other specified female genital organs; overlapping lesion and female genital tract, NOS
- C60, C63: penis; other and unspecified male genital organs
- C64, C65, C66, C68: kidney; renal pelvis; ureter; other and unspecified urinary organs
- C74, C75: adrenal gland; other endocrine glands and related structures

2004 rules* (code to use in parentheses):
- C01, C02 (C02.9): base of tongue; other and unspecified parts of tongue
- C00, C03, C04, C05, C06 (C06.9): lip; gum; floor of mouth; palate; other and unspecified parts of mouth
- C09, C10, C12, C13, C14 (C14.0): tonsil; oropharynx; pyriform sinus; hypopharynx; other and ill-defined sites in lip, oral cavity and pharynx
- C19, C20 (C20.9): rectosigmoid junction; rectum
- C23, C24 (C24.9): gallbladder; other and unspecified parts of biliary tract
- C33, C34 (C34.9): trachea; bronchus and lung
- C40, C41 (C41.9): bones, joints and articular cartilage of limbs; bones, joints and articular cartilage of other sites
- C65, C66, C67, C68 (C68.9): renal pelvis; ureter; bladder; other and unspecified urinary organs

* If diagnosed at different times, code the first diagnosis. If diagnosed at the same time, use the group code given above.
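A minimal sketch of applying the 2004 topography groupings when checking for multiple primaries: two tumours whose codes fall in the same group count as one site. The table below transcribes the 2004 groups from the slide; the function itself is only one piece of the full rules (histology groups are not modelled).

```python
# 2004 groups of topography codes considered a single site,
# keyed by the group code to use.
GROUPS_2004 = {
    "C02.9": {"C01", "C02"},
    "C06.9": {"C00", "C03", "C04", "C05", "C06"},
    "C14.0": {"C09", "C10", "C12", "C13", "C14"},
    "C20.9": {"C19", "C20"},
    "C24.9": {"C23", "C24"},
    "C34.9": {"C33", "C34"},
    "C41.9": {"C40", "C41"},
    "C68.9": {"C65", "C66", "C67", "C68"},
}
# Reverse index: 3-character topography code -> group code.
SITE_GROUP = {code: group for group, codes in GROUPS_2004.items()
              for code in codes}

def same_site(topo_a, topo_b):
    """True if two topography codes count as a single site under the groups."""
    a, b = topo_a[:3], topo_b[:3]
    return SITE_GROUP.get(a, a) == SITE_GROUP.get(b, b)

print(same_site("C01.9", "C02.1"))  # tongue subsites: one site -> True
print(same_site("C01.9", "C34.9"))  # tongue vs lung -> False
```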

Incidental diagnosis: the detection of cancer in individuals without symptoms.

1. Screen-detected cancers. The aim of screening is to detect asymptomatic cancers at an earlier stage. Incidence rises initially as prevalent cancers are detected; screening tends to reduce the incidence rates of colon and cervix cancer via the removal of premalignant lesions.

2. Autopsy diagnosis: practice in Asia?

Validity (accuracy)
1. Re-abstracting and recoding
2. Histological verification: the index of validity is the percentage of cases morphologically verified
3. Death certificate only (DCO) cases
4. Missing information
5. Internal consistency: IARC/IACR CHECK program

Timeliness: rapid reporting of information on cancer cases is another priority. There are no international guidelines for timeliness at present, but North American agencies have set out specific standards for their registries; SEER, for example, requires reporting within 22 months of the end of the diagnosis year.

Completeness
1. Semi-quantitative methods
2. Quantitative methods

Semi-quantitative methods
- Historic data methods: stability of incidence rates over time
- Comparison of incidence rates in different populations
- Shape of age-specific curves
- Incidence rates of childhood cancers
- Mortality:incidence (M:I) ratios
- Number of sources/notifications per case (average number of sources per case; average number of notifications per case)
- Histological verification of diagnosis

Quantitative methods
- Independent case ascertainment: capture-recapture method
- Death certificate methods: DCN/M:I method; the flow method
- Cases recruited into an international clinical follow-up study
- Patients enrolled into a multi-centre clinical trial
- Others, from various studies
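The M:I ratio used in both the semi-quantitative and death-certificate methods is simple to compute; the sketch below uses made-up counts, not registry data, and the interpretation comment is a rule of thumb rather than a formal threshold.

```python
def mi_ratio(deaths, cases):
    """Mortality:incidence ratio for one site and period.

    A ratio close to the expected case fatality for the site is
    consistent with complete registration; a ratio well above it
    suggests incident cases are being missed.
    """
    return deaths / cases

# Illustrative numbers for a site with roughly 50% case fatality:
# an M:I far above 0.5 here would be a warning sign for completeness.
print(round(mi_ratio(480, 1000), 2))
```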

Example: Incidence rates by year


CI5 Volume IX (editorial sheet 4) quality indicators: male

CI5 Volume IX (editorial sheet 4) quality indicators: female

Example

* Average percentage annual change since Volume VIII (1995-1997). Significant changes (95% confidence level; Miettinen method, page 869 of Volume VI) are marked in bold.
- Rate for age 0- is significantly different (24.3 vs 16.0)
- Rate for age 5- is significantly different (12.1 vs 6.5)
- Rate for age 10- is significantly different (13.4 vs 6.0)

Quality indices: MV%, DCO%, UNK%, M/I, MV% excluding C22, MV% excluding C91-95
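The editorial indices listed above are simple proportions over the case file. The sketch below computes three of them from an illustrative list of records; the field names, and the use of C80 as the marker for unknown primary site, are assumptions for the example.

```python
def quality_indices(cases):
    """MV%, DCO% and UNK% for a list of case records (illustrative schema)."""
    n = len(cases)
    mv = sum(c["morph_verified"] for c in cases)        # morphologically verified
    dco = sum(c["dco"] for c in cases)                  # death certificate only
    unk = sum(c["topography"].startswith("C80")         # primary site unknown
              for c in cases)
    return {"MV%": 100 * mv / n, "DCO%": 100 * dco / n, "UNK%": 100 * unk / n}

cases = [
    {"morph_verified": True,  "dco": False, "topography": "C16.9"},
    {"morph_verified": True,  "dco": False, "topography": "C34.9"},
    {"morph_verified": False, "dco": True,  "topography": "C80.9"},
    {"morph_verified": False, "dco": False, "topography": "C50.9"},
]
print(quality_indices(cases))  # {'MV%': 50.0, 'DCO%': 25.0, 'UNK%': 25.0}
```

The site-restricted variants (MV% excluding C22, or excluding C91-95) would filter the case list by topography or morphology before applying the same calculation.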

Data quality and comparability criteria (CI5 Vol. IX)

A: complete coverage; death reporting meets WHO recommendations; %Unk, DCO and ill-defined site ≤ 10%; MV% ≥ 80% (99-100% excluded); DCO 0.0% (DCO: none)

B: no access to death certificates; official mortality data not available by cause, or of poor quality by cause; 10% < %Unk, DCO or ill-defined site ≤ 20%; MV% too high (99-100%) or too low for selected sites (overall MV% < 75%); M/I above the threshold by site; implausible incidence rates; specialized registries (e.g. childhood, mesothelioma); data with