State of the Art: Methods and Tools for Archival Processing Metrics



DESCRIPTION

Presented to the 2013 Annual General Meeting of the Society of California Archivists.

TRANSCRIPT

Page 1: State of the Art: Methods and Tools for Archival Processing Metrics

Methods and Tools for Archival Processing Metrics

Audra Eagle Yun, MLIS, CA
Acting Head of Special Collections and Archives
University of California, Irvine Libraries

Society of California Archivists
Annual General Meeting 2013

Berkeley, California

Page 2

Why archival metrics?

• Heuristics for time and cost
• Evaluate processing techniques
• Quantitatively evaluate
• Create and revisit benchmarks

Page 3

A history of archival metrics

• Greene and Meissner’s review
• Assumptions:
  – Tracking includes arrangement, description, and minor conservation
  – Archivist works an average of 230 days per year
• 1976: Charles Schultz study
  – 40 cubic feet / year
• 1978 and 1982: William Maher studies at the University of Illinois
  – general files: 3.0 hours / cubic foot
  – personal papers: 6.9 hours / cubic foot
• 1980: W.N. Davis at California State Archives
  – 8 hours / cubic foot, average of all staff

Page 4

A history of archival metrics, continued

• 1982: Karen Temple Lynch and Thomas E. Lynch study of NHPRC and NEH processing grants
  – 20th-century collections average 12.7 hours / cubic foot
  – organizational records average 10.6 hours / cubic foot
• 1985: Terry Abraham, Stephen Balzarine, Anne Frantilla at Washington State
  – graduate workers average 5.5 days / cubic foot (<1 foot) or 3 days / cubic foot (>1 foot) for manuscripts
  – 2 days / cubic foot for archival series

Page 5

A history of archival metrics, continued

• 1987: Uli Haller, University of Washington
  – Large 20th-century collections: 3.8 hours / cubic foot
  – Significant observation: lack of standardization of processing levels
• 1995: Paul Erickson and Robert Shuster, Billy Graham Center Archives
  – Used prior studies to set expectations
  – Significant observation: “we’re processing more intensively than we realized or intended”
  – Actual averages: 15.1 hours and $375 / cubic foot
  – Important conclusion: “It is almost accepted as a given in the literature that processing methodologies and local conditions vary so widely from archives to archives that figures developed at one institution are meaningless at another”
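The per-unit figures in these studies are simple ratios of tracked effort and cost to collection extent. As a minimal illustration (not part of the presentation, and using hypothetical project totals), the Erickson/Shuster-style averages can be derived like this:

```python
def processing_rates(total_hours, total_cost, extent_cubic_feet):
    """Return (hours per cubic foot, cost per cubic foot) for one project."""
    hours_per_foot = total_hours / extent_cubic_feet
    cost_per_foot = total_cost / extent_cubic_feet
    return hours_per_foot, cost_per_foot

# Hypothetical project: 50 cubic feet processed in 755 staff hours at $18,750.
hours_per_foot, cost_per_foot = processing_rates(755, 18750, 50)
# 755 / 50 = 15.1 hours / cubic foot; 18750 / 50 = 375.0 dollars / cubic foot
```

Rates computed this way are only comparable across projects when the units (cubic vs. linear feet, staff hours vs. calendar days) are held consistent, which is part of what the tracking tools below try to enforce.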

Page 6

Metrics and efficient processing: a love story

• Economic realities
• Justification for resources
• Gauging techniques

Page 7

NGTS POT3 LT2B

• Alphabet soup: University of California Libraries’ Next-Generation Technical Services, Power of Three Group 3, Lightning Team 2B

• Charge: “Define a methodology and identify a data gathering instrument for capturing processing rates, to facilitate cost/benefit analysis of processing approaches. Data collected will additionally assist campuses in evaluating local processing benchmarks.”

Page 8

Environmental scan & interviews

• Literature and resource review
  – CLIR UCEC
  – PACSCL
  – OCLC
  – CHoM database and users: NCSU, Princeton
• Email, phone, and in-person interviews
  – Harvard Medical School Center for the History of Medicine
  – UCLA Library Special Collections
  – UCB Bancroft Library
  – Stanford University Library, Special Collections & University Archives
  – Free Library of Philadelphia
  – Princeton University, Seeley G. Mudd Manuscripts Library
• Topics
  – Key users of tracking tools
  – Essential data points
  – Ensuring success/buy-in
  – Challenges and benefits

Page 9

Interview findings

• Institutions that have implemented a tracking database or system, like the Harvard Processing Metrics Database, indicate that these tools can become integrated into the work structure, given team support and involvement in planning and implementation.

• Institutions that have chosen not to use a structured tracking system indicate that these tools are more complex, granular, and time-consuming than necessary. A few suggested that the level of detail expected for such systems is incompatible with MPLP techniques.

• Both users and non-users of a complex tracking database commented on the barriers to using Microsoft Access and suggested that a web-based solution would be more user-friendly.

• Interviewees suggested that clear expectations about units of measurement (time and linear feet) and processing plans are among the most useful aspects of processing metrics.

• Interviewees who were not tracking archival processing metrics advocate for resource allocation estimates (time and linear feet) that are created during planning and reviewed upon completion of processing projects.

• Some interviewees expressed concern about the possibility of tracking data being used to assess staff productivity or individual work quality.

• All interviewees discussed the importance of tracking processing work in some way in order to justify funding and staffing from resource allocators, as well as to provide more accurate information about the expected timeframe and perceived value of archival work. One interviewee suggested that archival metrics data can also assist in collection development decisions, contributing to estimates of how much time and money is needed to make certain types of collections available.

Page 10

Recommendations

• Metrics not as a mandate, but as a facilitator of data-driven decision making
• Can help create a set of common benchmarks across institutions
• Metrics for justification of needs and demonstration of value
• Minimum baseline data elements identified
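A minimum-baseline record can be quite small: a few fields per collection from which rates and benchmarks are derived. A hypothetical sketch (the field names here are illustrative, not the team’s actual data elements):

```python
from dataclasses import dataclass

@dataclass
class ProcessingRecord:
    collection: str
    extent_linear_feet: float   # extent processed
    hours_spent: float          # total staff hours tracked
    processing_level: str       # e.g. "collection", "series", "folder"

    def hours_per_foot(self) -> float:
        """Derived benchmark: staff hours per linear foot."""
        return self.hours_spent / self.extent_linear_feet

# Hypothetical entry, comparable across projects once units are consistent.
record = ProcessingRecord("Jane Doe papers", extent_linear_feet=20.0,
                          hours_spent=80.0, processing_level="series")
# record.hours_per_foot() -> 4.0
```

Recording the processing level alongside the rate matters because, as the Haller study above noted, rates are not comparable without standardized levels of processing.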

Page 11
Page 12

Available tools

• UC Libraries Baseline Archival Processing Metrics Spreadsheet

• PACSCL/CLIR Hidden Collections Processing Project, Processing Worksheet

• Processing Metrics Collaborative: Database Development Initiative

• UC Libraries Archival Processing Metrics Worksheet

http://uclib-prd-old.cdlib.org/cdc/hosc/efficientprocessing/index.html