Full Program Final







    SEPTEMBER 25-27, 2006






    September 15, 2006

    Dear colleagues,

    Welcome to the Library Assessment Conference: Building Effective, Sustainable, Practical Assessment. We are delighted to hold our first conference in historic Charlottesville. When Jefferson donated his books to the Library of Congress, he wrote, "I ask of your friendship, therefore, to make for me the tender of it to the Library Committee of Congress, not knowing myself of whom the Committee consists." In a similar fashion, we ask for your friendship in making sure that this effort succeeds.

    We are thrilled by the overwhelming response to the Conference, strong evidence of the blossoming of community awareness regarding library assessment. You will join more than 200 people participating in a rich three-day program of workshops, tours, engaging plenary speakers, useful concurrent and poster sessions, and many opportunities for informal discussion. The conference focus is on practical assessment that can be used to improve library service, and includes sessions on customer surveys, focus groups, learning outcomes, organizational climate surveys, performance metrics, evaluating electronic services and resources, and related marketing and management issues.

    Your commitment to library assessment is critical to the process of demonstrating the impact and connection of the library to the research, teaching, and learning process. A library assessment program can only be as strong as our own ability to learn and adapt to the new challenges confronting our organizations. We hope that this library assessment conference is a catalyst that helps all of us address these challenges.

    We want you to be forthcoming and candid with your assessment of our Library Assessment Conference; after all, it is an old saying that "the wisest of the wise may err" (Aeschylus). A learning community of practitioners interested in library assessment may aspire to be the wisest of the wise. Please join us as we learn, build community, and also make library assessment fun -- wise people know how important that is!

    Steve Hiller, University of Washington, Co-Chair

    Martha Kyrillidou, Association of Research Libraries, Co-Chair

    Jim Self, University of Virginia, Co-Chair

    And the rest of the Conference Planning Committee:

    Francine DeFranco, University of Connecticut

    Brinley Franklin, University of Connecticut

    Richard Groves, Association of Research Libraries

    Lisa Janicke Hinchliffe, University of Illinois at Urbana-Champaign

    Joan Stein, Carnegie Mellon University

    Lynda White, University of Virginia





    Thank You to Our Reception Sponsors




    Omni Floor Map




    Sunday, September 24

    3:00 p.m. - 8:00 p.m. Early Conference Registration

    Hotel Lobby

    University of Virginia Library staff will be available to answer questions about Charlottesville and to provide dinner recommendations.

    3:50 p.m. Monticello/Jefferson Library Tour Meets

    Meet in Atrium

    4:00 p.m. Buses Leave for Monticello

    7:30 p.m. Buses Leave Monticello for Omni

    8:00 p.m. Dinner on your Own in the Downtown Mall




    Monday, September 25

    8:00 a.m. - 6:00 p.m. Conference Registration Open

    Salon A/B Prefunction Area

    9:00 a.m. - 12:00 noon Preconference Workshops

    12:00 noon - 1:00 p.m. Lunch on your Own

    1:00 p.m. - 2:00 p.m. Welcome & Opening

    Salon A/B

    Conference Co-Chairs Steve Hiller, Jim Self, and Martha Kyrillidou

    Speaker: Duane Webster, Executive Director, Association of Research Libraries

    2:00 p.m. - 3:00 p.m. Plenary I

    Salon A/B

    John Lombardi, Chancellor, University of Massachusetts-Amherst

    Library Performance Measures That Matter

    3:00 p.m. - 3:30 p.m. Break

    3:30 p.m. - 5:00 p.m. Parallel Session 1

    Service Quality Assessment

    Salon A

    LibQUAL+, ProSeBiCA (Development of New Library Services by Means of Conjoint Analysis), and CAPM (Comprehensive Access to Printed Materials)

    Panel: Fred Heath, Colleen Cook, Martha Kyrillidou, Bettina Koeper, Reinhold

    Decker, and Sayeed Choudhury

    How You Can Evaluate the Integrity of Your Library Service Quality Assessment

    Data: Intercontinental LibQUAL+ Analyses Used in Concrete Heuristic Examples

    Bruce Thompson, Martha Kyrillidou, and Colleen Cook

    Qualitative Approaches I

    Salon B

    Wayfinding in the Library: Usability Testing of Physical Spaces

    Nancy J. Kress, David K. Larsen, Tod A. Olsen, and Agnes M. Tatarka

    Data Analysis and Presentation with Joe Zucca

    Salon A

    Introduction to Survey Analysis with Neal Kaske

    Salon B

    Introduction to Focus Groups and Other Qualitative Methods with Colleen Cook

    James Monroe




    Monday, September 25

    5:00 p.m. - 6:00 p.m. Parallel Session 2 continued

    6:00 p.m. - 7:30 p.m. Poster Session & Drinks

    Posters in Preston; Bar in Atrium Open 6:15-7:15

    Building Assessment in our Libraries II

    Ashlawn & Highlands

    Library Assessment on a Budget: Using Effect Size Meta-Analysis to Get the Most out of the Library-Related Survey Data Available across Campus

    Eric Ackermann

    Developing an Integrated Approach to Library and Information Technology


    Panel: Jill Glaser, Bill Myers, Ryan P. Papesh, John M. Stratton

    Usage and Outcomes Evaluation of an Information Commons: A Multi-Method Pilot Study

    Rachel Applegate

    Issues in Establishing a Culture of Assessment in a Complex Academic Health Sciences Library

    Sally Bowler-Hill and Janis Teal

    Use of RFID Applications in Libraries

    Navjit Brar

    Statistics & Assessment: The Positive Effects at the Harold B. Lee Library of Brigham Young University

    Julene Butler and Brian Roberts

    Improving Library Services Using a User Activity Survey

    Alicia Estes

    Introducing the READ Scale: Qualitative Statistics for Academic Reference Services

    Bella Karr Gerlich and G. Lynn Berard

    Are the Needs and Wants the Same? Comparing Results from Graduate Student,

    Undergraduate Student, and Faculty Surveys

    Lisa Janicke Hinchliffe and Tina E. Chrzastowski

    Challenges Inherent in Assessing Faculty Productivity: A Meta-Analysis Perspective

    Sheila Curl Hoover

    Assessing the Research Impact of Electronic Journals at the University of Notre Dame

    Carol Branch, Carole Pilkinton, and Sherri Jones




    Tuesday, September 26

    11:00 a.m. - 12:00 noon Parallel Session 4 continued

    12:00 noon - 1:15 p.m. Luncheon (No Speaker)


    1:30 p.m. - 2:30 p.m. Plenary II

    Salon A/B

    Cathy De Rosa, Vice President, Marketing & Library Services, OCLC

    Changing User Needs and Perceptions

    2:30 p.m. - 3:00 p.m. Break

    3:00 p.m. - 5:00 p.m. Parallel Session 5

    Evaluation and Assessment Methods

    Salon B

    Choosing the Best Tools for Evaluating Your Library

    Neal Kaske

    Developing Best Fit Library Evaluation Strategies

    Charles R. McClure, John Carlo Bertot, Paul T. Jaeger and John T. Snead

    Strategic Planning

    Ashlawn & Highlands

    Accountability to Key Stakeholders

    Raynna Bowlby and Daniel O'Mahony

    Drilling the LibQUAL+ Data for Strategic Planning

    Stewart Saunders

    Library As Place

    Salon A

    Assessing Learning Spaces: A Framework

    Joan K. Lippincott

    Combining Quantitative and Qualitative Assessment of an Information Commons

    Gordon Fretwell and Rachel Lewellen

    Listening to Users: The Role of Assessment in Renovation to Meet User Needs

    Kimberly Burke Sweetman and Lucinda Covert-Vale

    Net Generation College Students and the Library as Place

    Aaron K. Shrimplin and Matthew Magnuson




    Tuesday, September 26

    3:00 p.m. - 5:00 p.m. Parallel Session 5 continued

    5:00 p.m. - 6:30 p.m. Buses Depart Omni for University of Virginia

    Meet in Atrium

    Please gather for the buses at the Mall entrance of the Atrium. For those who would like to visit the UVa Lawn or view exhibits at the Harrison Institute/Small Special Collections Library, the buses will leave at 5:00 p.m. Doors to the reception will not open until 6:00 p.m.

    6:00 p.m. - 8:00 p.m. Reception at Harrison Institute / Small Special Collections Library

    Speaker: Karin Wittenborg, University Librarian, University of Virginia

    7:45 p.m. - 8:20 p.m. Buses Depart University of Virginia for Omni

    Bus pick-up location will be the same as drop-off, in front of the Harrison Institute/Small Special Collections Library. Look for the buses with Library Assessment Conference signs returning to the Omni.

    8:30 p.m. - 10:30 p.m. Drinks and Conversation with Library Luminaries

    Downtown Mall Bars

    Balanced Scorecard

    Salon B

    Balanced Scorecards in Public Libraries: A Project Summary

    Joe Matthews

    The People Side of Planning & Implementing a Large Scale Balanced Scorecard


    Susanna Pathak

    Yours, Mine, and Ours: Staff Involvement in the Balanced Scorecard

    Panel: Leland Deeds, Tabzeera Dosu, Laura Miller, Paul Rittelmeyer, Annette Stalnaker, Donna Tolson, and Carol Hunter

    Assessing Organization Climate

    Ashlawn & Highlands

    From Organizational Assessment to Organizational Change: The University of

    Maryland Library Experience

    Panel: Sue Baughman, Johnnie Love, Charles B. Lowry, and Maggie Sopanaro

    Diversity and Organizational Culture Survey: Useful Methodological Tool or

    Pandora's Box?

    Laura Lillard

    Looking In and Looking Out: Assessing Our Readiness to Embrace the Future

    Nancy Slight-Gibney




    Wednesday, September 27

    7:30 a.m. - 9:00 a.m. Registration Open

    Salon A/B Prefunction Area

    7:30 a.m. - 9:00 a.m. Continental Breakfast

    Salon A/B

    9:00 a.m. - 10:00 a.m. Plenary III

    Salon A/B

    Paul Hanges, Professor, Industrial and Organizational Psychology,

    University of Maryland

    Organizational Diversity and Climate Assessment

    10:00 a.m. - 10:30 a.m. Break

    10:30 a.m. - 12:00 noon Parallel Session 6

    Organizational Culture/Learning

    Salon A

    Assessing Organizational Culture: Moving towards Organizational Change and Renewal

    Lyn Currie and Carol Shepstone

    Tools for Creating a Culture of Assessment: The CIPP Model and Utilization-Focused Evaluation

    Yvonne Belanger

    The Use of Outcome Based Evaluation (OBE) to Assess Staff Learning Activities at University of Maryland Libraries

    Irma F. Dillon and Maggie Sopanaro

    Digital Library

    Salon B

    Usability Assessment of Academic Digital Libraries

    Judy Jeng

    Listening to Users: Creating More Useful Digital Library Tools and Services by

    Understanding the Needs of User Communities

    Felicia Poe

    All That Data: Finding Useful and Practical Ways to Combine Electronic Resource

    Usage Data from Multiple Sources

    Maribeth Manoff and Eleanor Read




    Monday, September 25, 2:00 p.m. - 3:00 p.m.

    Plenary I Salon A/B

    Library Performance Measures That Matter

    John Lombardi, Chancellor, University of Massachusetts-Amherst; Co-editor, The Center

    John Lombardi will speak in the general area of what a university administrator needs to know from the library. He has been the major force behind The Center, which measures the performance of American research universities.





    Monday, September 25, 3:30 p.m. - 5:00 p.m.

    Parallel 1 #1: Service Quality Assessment

    Salon A

    LibQUAL+, ProSeBiCA (Development of New Library Services by Means of Conjoint Analysis), and CAPM (Comprehensive Access to Printed Materials)

    Panel: Fred Heath, Colleen Cook, Martha Kyrillidou, Bettina Koeper, Reinhold Decker, and Sayeed Choudhury

    Introduction by Fred Heath, Colleen Cook, and Martha Kyrillidou regarding LibQUAL+ and its relation to the methodologies described by the other panelists.

    Bettina Koeper, ProSeBiCA (Development of New Library Services by Means of Conjoint Analysis)

    In the context of changing educational environments, current discussions about the strategic development of academic libraries clearly show the need for a basic change in their self-conception: turning from mere academic institutions into service providers who actively design and offer services that fit users' needs and preferences. Customer acceptance of library services is increasingly becoming a new quality standard that will have significant effects on libraries' status within their universities. But how can this acceptance be achieved? How can a clear idea of user preferences be obtained in order to shape future library services?

    The ProSeBiCA project, funded by the German Research Foundation (DFG) and carried out by the Chair of Marketing at Bielefeld University and Bielefeld University Library, tries to answer that question by adapting conjoint analysis, a marketing research method, to the library context. Unlike surveys that evaluate existing services, conjoint analysis is a tool based on preference measurement that yields profound knowledge about users' requirements for both services already available and potential ones. ProSeBiCA thus aims at the development of a comprehensive analysis and simulation framework for academic libraries that enables well-founded strategic planning of future service design. Its portability will be audited in cooperation with the University of Cottbus. An intensive exchange of information with the Sheridan Libraries of Johns Hopkins University, USA, completes the project and has led to further cooperation involving CAPM and LibQUAL+.

    Sayeed Choudhury, CAPM (Comprehensive Access to Printed Materials)

    As part of JHU's CAPM Project (http://ldp.library.jhu.edu/projects/capm), Choudhury led the development of an evaluation methodology using multi-attribute, stated-preference techniques. Multi-attribute, stated-preference methods feature choice experiments to gather data for modeling user preferences. In the choice experiments, often expressed as surveys, subjects state which alternatives (services or features) they most prefer; the alternatives are distinguished by their multiple attributes. This approach was used to consider tradeoffs in varying attributes for a specific service: access to materials in an off-site shelving facility. Patrons were asked to choose among varying times for delivery, access to images, and the ability to search full text, along with differing (hypothetical) fees for each of the attributes.

    During the 2002 JISC/CNI Conference, Choudhury mentioned the possibility of integrating the LibQUAL+ and CAPM assessment methodologies. LibQUAL+ helps identify gaps in a wide range of library services, but the question of priorities among the gaps is not immediately addressed. The CAPM methodology explicitly explores patrons' preferences or choices for implementing a particular library service. Given the different but complementary areas of emphasis and different theoretical underpinnings, there seemed to be potential for an integrated, and more comprehensive, approach. Fundamentally, the outputs from a LibQUAL+ analysis can provide the inputs for a multi-attribute, stated-preference analysis, which acknowledges the need for tradeoffs when making decisions regarding resource allocation. Even with this promising idea, there was arguably too large a difference in the levels of granularity between the two methodologies. Bielefeld's ProSeBiCA provides the appropriate bridge.




    Fred Heath, Vice Provost and Director of Libraries, University of Texas

    Colleen Cook, Dean and Director of the Texas A&M University Libraries

    Martha Kyrillidou, Director of Statistics and Service Quality Programs

    Bettina Koeper, ProSeBiCA (Development of New Library Services by Means of Conjoint Analysis)

    Reinhold Decker, Professor, Department of Business Administration and Economics, Bielefeld University, Germany

    Sayeed Choudhury, CAPM (Comprehensive Access to Printed Materials)





    Monday, September 25, 3:30 p.m. - 5:00 p.m.

    Parallel 1 #1: Service Quality Assessment

    Salon A

    How You Can Evaluate the Integrity of Your Library Service Quality Assessment Data: Intercontinental LibQUAL+ Analyses Used in Concrete Heuristic Examples

    Bruce Thompson, Martha Kyrillidou, and Colleen Cook

    This user-friendly, conversational presentation explains how you can evaluate the integrity or trustworthiness of library service quality assessment data. Using the metaphor of a bathroom scale, the ideas underlying (a) score reliability and (b) score validity are presented in an accessible manner. The use of the software SPSS to compute the related statistics is illustrated. LibQUAL+ data are used in heuristic examples to make the discussion concrete, but the illustrations apply both to these and to other measures of library service quality.

    Martha Kyrillidou, Director of Statistics and Service Quality Programs

    Bruce Thompson, Distinguished Professor of Educational Psychology and CEHD Distinguished Research Fellow, and Distinguished Professor of Library Science, Texas A&M University, and Adjunct Professor of Family and Community Medicine, Baylor College of Medicine (Houston)

    Colleen Cook, Dean and Director of the Texas A&M University Libraries





    Monday, September 25, 3:30 p.m. - 5:00 p.m.

    Parallel 1 #3: Qualitative Approaches I

    Salon B

    Wayfinding in the Library: Usability Testing of Physical Spaces

    Nancy J. Kress, David K. Larsen, Tod A. Olsen, and Agnes M. Tatarka

    It's common for libraries to evaluate the usability of online information systems by asking a subject to think out loud while they perform such tasks as finding a book title or journal article. This same technique can be adapted to determine how well individuals are able to find and retrieve information in the physical environment.

    This paper discusses the benefits and utility of even small-scale wayfinding studies as a tool for highlighting barriers to the use of library collections. A primary benefit is that wayfinding studies allow librarians and other staff to see their library space through new eyes and better understand how difficult it can be for novice users to find library materials. This knowledge can inform efforts to reconfigure libraries and underscore areas for instruction. Wayfinding studies can also evaluate the effectiveness of library orientation programs and directional aids.

    An example illustrating the utility of wayfinding studies is a recent assessment at the University of Chicago Library. When this library participated in the spring 2004 LibQUAL+ survey, it received many comments from users that books were frequently not on the shelf. However, a follow-up study showed that over a fifth of the books that patrons reported being unable to find were in place on the shelf. So, in the spring of 2005, a team at the University of Chicago Library undertook a study to help identify the reasons users were having trouble finding books in the Regenstein Library, a large building housing over 4 million volumes of material in the Humanities and Social Sciences in open bookstacks.

    To discover failure points in the book retrieval process, users were directly observed throughout the process, from finding the catalog record and recording the necessary information, to using online and printed guides and maps to navigate their way to the correct shelf.

    Subjects were recruited from first-year students who were asked to complete a brief online questionnaire on their Library use. Subjects were selected who had little history of checking out material from the Library, in order to best approximate the new-user experience. Subjects were given full bibliographic citations to three books and asked to follow the "think-out-loud" protocol while conducting the catalog search, interpreting the record, and locating the books in the collections. As subjects attempted to complete their tasks, they were monitored by a facilitator and a note taker from the study team.

    The study revealed problems with the terms used to designate locations, the arrangement of physical collections, the lack of an effective map and signage system, and the failure to distinguish between reference and circulating collections, and it highlighted difficulties in reading call numbers. This compelling information is being used to reconfigure spaces, improve directional aids, and inform our orientation and instruction programs - changes which will be assessed for effectiveness through additional wayfinding studies.




    Monday, September 25, 3:30 p.m. - 5:00 p.m.

    Parallel 1 #2: Building Assessment in our Libraries I

    Ashlawn & Highlands

    Getting Started with Library Assessment: Using Surveys to Begin an Assessment Initiative

    Lisa Janicke Hinchliffe and Tina E. Chrzastowski

    Developing a library assessment program is often a challenging task. Librarians and staff may question allocation of resources to assessment activities and feel threatened by potential results. This paper presents a case study for using library user surveys as the foundation for an evolving assessment program and related organizational development activities.

    The University Library at the University of Illinois at Urbana-Champaign (UIUC) undertook a three-year program of patron surveys to determine attitudes towards the library's services, facilities, and collections and to begin a library assessment program. This initial foray into library-wide assessment was administered by the Library's Services Advisory Committee. The first group surveyed (spring 2004) consisted of graduate and professional students, followed by undergraduate students (spring 2005) and faculty (spring 2006).

    This series of surveys marks the beginning of formal library assessment at UIUC. Although the UIUC library participated in LibQUAL+ surveys in the past and individual UIUC librarians have been actively conducting library assessment at the unit level for many years, these patron surveys represent a new commitment to library-wide assessment at UIUC. Further opportunity for assessment education was made possible through participation in the Association of Research Libraries (ARL) program Making Library Assessment Work: Practical Approaches for Developing and Sustaining Effective Assessment. Steve Hiller and Jim Self, ARL Visiting Program Officers for library assessment, visited the UIUC Library on May 23-24, 2005. This visit, sponsored by the Services Advisory Committee, prompted discussion about library assessment and provided an outsider's view of our current assessment plans and projects.

    This paper will present the results from the three user surveys and will also focus on the UIUC Library's experiences in getting started with library assessment. Strategies employed by the Services Advisory Committee to promote assessment and begin to create a culture of assessment will be presented, as well as current plans, successes and failures, and our assessment directions for the future.

    Lisa Janicke Hinchliffe is Coordinator for Information Literacy Services and Instruction and Associate Professor of Library Administration at the University of Illinois at Urbana-Champaign. Her research focuses on library use and topics related to teaching and learning.

    Tina E. Chrzastowski is Chemistry Librarian and Professor of Library Administration at the University of Illinois at Urbana-Champaign. Her research focuses on library use and users.





    Monday, September 25, 3:30 p.m. - 5:00 p.m.

    Parallel 1 #2: Building Assessment in our Libraries I

    Ashlawn & Highlands

    A Leap in the Right Direction: How a Symbiotic Relationship Between Assessment and Marketing Moves the Library Forward

    Melissa Becher and Mary Mintz

    Assessment activities are never ends in themselves. The resulting data can be put to work throughout the library, but particularly in marketing efforts. Assessment identifies user populations that would benefit from targeted marketing and also documents their awareness of library services. Marketing campaigns to these populations address gaps in understanding and forge meaningful relationships between users and the library. Users' resulting growing satisfaction can be measured by further assessment, providing evidence of the marketing campaign's effectiveness.

    In this presentation, the authors show how a relationship between assessment and marketing developed at American University Library, how it led to an award-winning marketing campaign, and how it continues to inform joint assessment and marketing efforts that move the library forward.

    The AU Library Assessment Team participated in LibQUAL+ in 2001. The results from that survey were surprisingly negative for undergraduate students, prompting the team to conduct special focus groups in fall 2002. Further participation in LibQUAL+ 2003 and analysis of the results of university-conducted surveys confirmed the team's understanding of undergraduate perceptions of the library. All data indicated that undergraduates had a lower level of familiarity with the library and a lack of awareness of library resources and services.

    The Library Marketing Team used Assessment Team data and analyses to identify undergraduate students as a group primed for targeted marketing. The Marketing Team saw that the library could make significant gains by increasing undergraduate awareness of the resources and services available to them. Team members initiated a formal series of meetings with Assessment Team members as part of the planning process for a fall 2004 campaign. The relationship between the two teams ensured that the campaign aligned with Assessment Team findings about the undergraduate population. This marketing campaign won the 2005 Best Practices in Marketing Academic and Research Libraries @ Your Library Award from ACRL.

    Initial results from LibQUAL+ 2005 show that American University undergraduates' perceived level of library service has moved closer to their desired level of service. While it is hard to say that the marketing campaign was the only source of the increase, the results are promising enough to explore further. The Marketing and Assessment Teams plan more assessment to track long-range changes in perceptions of students who first matriculated in 2004 and will graduate as the class of 2008. These activities will determine whether the fall 2004 campaign has effectively reached a goal of increasing undergraduate student satisfaction with and use of library services by at least twenty percent over four years. The relationship between assessment and marketing at American University will continue to be essential to attaining this goal.

    Melissa Becher has an M.S.L.I.S. from the University of Illinois at Urbana-Champaign. A Reference/Instruction Librarian, she leads the American University Library Assessment Team.

    Mary Mintz has an M.S.L.S. from the University of North Carolina at Chapel Hill. She is Senior Reference Librarian and a founding member of the Library's Marketing Team.




    Monday, September 25, 5:00 p.m. - 6:00 p.m.

    Parallel 2 #1: LibQUAL+ Follow-up

    Salon A

    Getting Our Priorities in Order: Are Our Service Values in Line with the Communities We Serve?

    Jocelyn Duffy, Damon Jaggars, and Shanna Smith

    LibQUAL+ is used by academic libraries from more than 500 institutions, including colleges, universities, and community colleges, as a method of measuring users' perceptions of service quality. The instrument allows users to rate their minimum, perceived, and desired levels of service for 22 items in three dimensions: information control, library as place, and service affect. Using the results from the 2005 survey at the University of Texas at Austin, we examine how well the service priorities of library staff are aligned with the priorities of undergraduates, graduate students, and faculty.

    To define the priorities for a given individual, we re-scaled the desired score for each item to reflect the degree to which the item is above or below the average desired level across all items for that individual. The rescaled scores (termed "priority scores") for the 22 items were then compared between the four groups using a multivariate analysis of variance (MANOVA). Preliminary results indicate that service priorities for library staff align more closely with those of undergraduates than with those of graduate students and faculty.

    This analysis is a first step in identifying service priority gaps between library staff and the users they serve. Our intention is to promote discussion among library staff about users' needs and how closely staff service priorities align with those needs. In addition, our findings may prove useful as management information by allowing us to analyze our users' service priorities and integrate the results of this analysis into organizational decision-making and planning processes.

    This paper will focus on the University of Texas Libraries, but the question answered and the method of analysis will be useful to all libraries with a similar data set. We believe this will be a unique utilization of LibQUAL+ data, as we have not found a similar study within the existing LibQUAL+ literature.

    Jocelyn Duffy is the Assessment Coordinator for the University of Texas Libraries at the University of Texas at Austin. She has made presentations on LibQUAL+ and service issues and participated in information fairs at local and national meetings and library conferences.

    Damon Jaggars is the Associate Director for Student & Branch Services for the University of Texas Libraries. He has presented on service quality assessment in libraries at several regional and national conferences.

    Shanna Smith is the Manager of the Research Consulting group at the University of Texas at Austin, where she consults with clients from a variety of disciplines on data collection and analysis issues. She has presented papers with methodological and statistical content at several regional and national conferences and one international conference.





    Monday, September 25, 5:00 p.m. - 6:00 p.m.

    Parallel 2 #3: Qualitative Approaches II

    Salon B

    Meliora: The Culture of Assessment at the University of Rochester's River Campus Libraries

    Nora Dimmock, Judi Briden, and Helen Anderson

    The River Campus Libraries at the University of Rochester have created an effective and sustainable user needs assessment program by establishing a website Usability Group, undertaking an undergraduate research study, and by putting statistics and a desktop report program on the librarians' desktop. The information gathered by these programs has helped us to develop a quantitative as well as qualitative picture of our users, allowing us to connect with them through our website, reference services and our collections.

    Nora Dimmock will give an overview of the Usability Group's contribution to the culture of assessment and demonstrate how it has become an integral part of the website design process at River Campus Libraries. Group members are assigned to a specific project under development. They conduct user studies throughout the design cycle. Collaboration with content and design groups gives the advantage of a much larger website development program using an iterative design process that is scalable, sustainable and successful.

    Judi Briden will describe the Undergraduate Research Project, a two-year study conducted by a team of librarians and staff, including an anthropologist. The study applied ethnographic methodologies to gain new perspectives on how undergraduates working on papers and other research-based assignments interacted with the libraries' resources. Methods included recorded interviews in and out of the library, photo surveys, mapping, and dorm visits. The resultant recordings, drawings and photographs were co-viewed and discussed by reference librarians and team members. This shared process generated new insights for improving reference outreach, library facilities and web pages for undergraduates at the University of Rochester.

    Helen Anderson will discuss the Libraries' subject liaison and collection development program. Subject librarians use skills and techniques developed through participation in groups such as the Undergraduate Research Project, the Usability Group and content groups to develop relationships with students and faculty and to learn about how those groups use library services and collections. They are encouraged to think about collections and access in broad terms. Tools such as our Bibliographer's Desktop empower staff to create their own collection-related reports.

    All of these groups contribute to the culture of assessment that has evolved at the River Campus Libraries over the last five to ten years.

    Nora Dimmock is Head of the Multimedia Center, Film Studies Librarian, and a member of the Usability Group.


    Judi Briden is Digital Librarian for Public Services, subject librarian for Brain and Cognitive Sciences, and a member of the Undergraduate Research Team.

    Helen Anderson is Head, Collection Development for River Campus Libraries, Russian Studies Librarian, and a member of the Undergraduate Research Team.




    Monday, September 25, 5:00-6:00

    Parallel 2 #2: Building Assessment in our Libraries II

    Ashlawn & Highlands

    Library Assessment on a Budget: Using Effect Size Meta-Analysis to Get the Most out of the Library-Related Survey Data Available across Campus

    Eric Ackermann

    Data related to library service quality can exist in the results of surveys conducted across campus for non-library reasons. These surveys can range from the nationally administered Higher Education Research Institute (HERI) Faculty Survey to the locally generated and administered freshman orientation course satisfaction surveys. At many colleges and universities these surveys are conducted regularly and provide space for local questions, which can include several library-related items. For libraries with limited assessment budgets, getting several library-related questions on these surveys can be an inexpensive source of additional information about user needs and perceptions of library service quality. It does, however, leave one with the problem of making sense of data from many different survey instruments, often using a bewildering array of sampling strategies, scales, data analyses, and outcomes reporting. One solution is to use meta-analysis. Meta-analysis is a quantitative method of research synthesis developed in the social sciences to handle data comparisons across disparate studies in a statistically valid manner.

    This study explores the use of meta-analysis as a library assessment tool, in particular one type of meta-analysis: effect size. It is used to compare the results from six analogous, library-related survey items from two different survey instruments administered by Radford University to its undergraduates in 2005: LibQUAL+ and the Radford University Undergraduate Exit Survey. The six items examined are hours of operation, access to information, staff quality, collection quality, users' ability to find information, and users' ability to analyze information. The process of effect size meta-analysis and its results are examined for their strengths and limits as a library assessment technique in light of its practicality, sustainability, and effectiveness.
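    To make the effect-size idea concrete, one common standardized mean difference (Cohen's d) can be computed from each instrument's summary statistics, which lets items measured on different scales be compared in standard-deviation units. The numbers below are invented for illustration, not Radford's actual results.

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    # Pooled standard deviation across the two samples
    pooled = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                       / (n1 + n2 - 2))
    # Standardized mean difference: scale-free, so results from
    # differently scaled survey instruments become comparable
    return (mean1 - mean2) / pooled

# Invented summary statistics for one item ("hours of operation")
# as rated on two different instruments with different scales
d = cohens_d(mean1=7.2, sd1=1.1, n1=300, mean2=3.9, sd2=0.8, n2=250)
```

    Because d is expressed in pooled standard-deviation units, the same statistic can be computed for each of the six items on each instrument and the results compared side by side.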

    Eric Ackermann (M.S. in Information Sciences, University of Tennessee-Knoxville, 2001) is currently the Reference/Instruction and Assessment Librarian at Radford University's McConnell Library. He managed the LibQUAL+ survey of students in 2005 and faculty and staff in 2006. He is a member of the Virginia Assessment Group and VLACRL's Assessment SIG.





    Monday, September 25, 5:00-6:00

    Parallel 2 #2: Building Assessment in our Libraries II

    Ashlawn & Highlands

    Developing an Integrated Approach to Library and Information Technology Assessment

    Jill Glaser, Bill Myers, Ryan P. Papesh, John M. Stratton

    This presentation will provide an overview of the activities of the Information Services Assessment Council within the changing organizational climate at the University of Kansas. Ten years ago, three formerly separate divisions were integrated to form a larger organizational entity known as Information Services (IS). These divisions are the KU Libraries, Information Technology, and Networking and Telecommunications Services. Though each division maintains a separate identity and is overseen by separate administrative hierarchies, each division reports to the Vice Provost for Information Services.

    In recognition of the common needs and challenges our users face in the technologically advanced and interconnected scholarly information environment, each division is working more collaboratively than ever before. In this milieu, one constant need remains: the need to recognize and predict, as far as possible, changes in users' needs and behaviors and to measure overall IS effectiveness in meeting them.

    Embedded within the IS management structure are several groups charged with both leading initiatives and advising leadership of organization-wide activities designed to meet the larger University mission. The Information Services Assessment Council has as its charge the responsibility to oversee the assessment activities of the three divisions within Information Services, and strives to enable evidence-based decision-making by fostering and supporting a culture of assessment: an ongoing process in which services, resources and performance are measured against the expectations of users, and improvements are made to meet users' needs effectively and efficiently.

    To accomplish this, the Assessment Council collaborates with and advises IS leadership and staff to identify priority assessment activities; develops and coordinates assessment-related staff development and educational programs; assists staff in developing and conducting assessment activities; and reports to the IS-wide community on assessment activities and initiatives.

    This presentation will identify the challenges and opportunities of conducting user-centered assessment in a fully integrated library/IT organization. It will describe the similarities and differences of these cultures, the steps that have been taken to use assessment as a unifying theme for approaching user services, and review several assessment activities, staff development activities, and planning initiatives that have resulted.

    Jill Glaser has earned a Bachelor of Arts in Music, Piano Emphasis, and a Master of Business Administration, Information Technology Emphasis, both from the University of Kansas. Jill has worked primarily as a Web developer for the last 8 years, previously for IBM and Sprint. She currently serves as Web Services Coordinator for the University of Kansas Libraries, in addition to other Web development responsibilities in the Information Technology department.

    Bill Myers is director of assessment for information services at the University of Kansas, a new position created to facilitate an integrated libraries/IT assessment program at KU. Bill was formerly director of library development and assessment coordinator for the KU Libraries. He received the B.A. and M.A. in English from Fort Hays State University (Kansas).




    Ryan R. Papesh has earned a Bachelor of Science in Engineering from Purdue University, and a Master of Science in Management from North Carolina State University. Ryan has worked extensively in the marketing of technology, in several industries, coast to coast. His choice of technology for the last 15 years is telecommunications, and he currently serves as the Customer Service Manager at Kansas University's Department of Networking and Telecommunications Services.

    John M. Stratton has served as Head of Outreach Services for the University of Kansas Libraries since April 2005. Prior to that, he served as Head of the Regents Center Library at the KU Edwards Campus in Overland Park, Kansas, and as Co-coordinator of Reference Services for KU Libraries. John received his B.A. in History from KU and a Master of Science in Library and Information Science from the University of Illinois, Urbana-Champaign.





    Monday, September 25, 6:00-7:30

    Poster: Preston

    Issues in Establishing a Culture of Assessment in a Complex Academic Health Sciences Library

    Sally Bowler-Hill and Janis Teal

    Objective: To report on the University of New Mexico (UNM) Health Sciences Library and Informatics Center's (HSLIC) experience in creating a culture of assessment through the regular administration of customer satisfaction surveys for its library and technology support services.

    Setting: HSLIC's organizational structure includes four major divisions, reflecting its responsibilities to deliver the following services to the Health Sciences campus: library services; technology support (workstation support, network, email, file storage, web administration, and application development); educational development (consultation on educational content for online and in-person learning experiences); and biomedical informatics training and consultation.

    Goal: To create a unified culture of assessment in which services are assessed regularly, contributing to a picture of the overall effectiveness of HSLIC.

    Methods: The management team committed to administration of annual surveys, initially of library services and then of technology support. Library services surveys began in 2002 with a survey developed in-house and continued in 2003 and 2005 using the LibQUAL+ survey. A technology support survey was developed in-house and administered in 2004 and 2006 because environmental scans, literature searches, and a survey of the Association of Academic Health Sciences Libraries (AAHSL) directors did not reveal any national standardized surveys such as LibQUAL+ for technology support services. The process of conducting an environmental scan and adopting or developing assessment measures for other services, such as educational development and biomedical informatics, is being planned.

    Results: A difficulty arose in comparing the gap analysis scores from LibQUAL+ with the Likert scale results from the technology support survey. This difficulty illustrates that having separate surveys using different methodologies limits HSLIC's ability to integrate survey data and assess overall strengths and weaknesses. It also impedes the development of a unified culture of assessment by compartmentalizing service units. Further, the technology support survey does not afford the opportunity to benchmark against similar institutions. For libraries like HSLIC, whose responsibilities extend beyond traditional library services, the use of different survey tools to assess different types of services presents problems in consistency of data interpretation, benchmarking, and strategic planning.

    Conclusions: As HSLIC further develops its technology support survey and begins to evaluate assessment measures for other services, the cost-benefit of creating in-house surveys that better align with LibQUAL+ versus accepting inherent discrepancies derived from using different methodologies will be evaluated. The result will be a unified body of assessment which contributes to a picture of the overall effectiveness of HSLIC services, creating a unified culture of assessment in the organization.




    Monday, September 25, 6:00-7:30

    Poster: Preston

    Use of RFID Applications in Libraries

    Navjit Brar

    The adoption of Radio Frequency Identification (RFID) technology by libraries promises a solution that could make inventorying of items in their collections possible in days instead of months and allow patrons to check out and return library property automatically at any time of the day. With an estimated 35 million library items tagged worldwide in over 300 libraries, this technology is generating ever-increasing interest. Besides speeding up checkouts, keeping collections in better order and alleviating repetitive strain injuries among librarians, RFID promises to provide better control of theft, non-returns and misfiling of a library's assets.

    The Industrial Technology Department and Robert E. Kennedy Library at Cal Poly State University, San Luis Obispo, collaborated in a research project to test and assess the effectiveness of RFID technologies in the library setting. From October to November 2004, we surveyed participating libraries, RFID listserv, and LITA-L listserv subscribers to collect information with regard to the implementation of RFID systems in libraries. As a result of the positive response from the library world, vendors' interest in loaning us this technology for testing, and our own students' interest in preparing for a better job market, we decided to conduct further research by actually testing this system. Libramation provided us with the equipment for testing during spring 2005. With an RFID simulation of 250 items, this project addressed common issues of contention for any library. The hypotheses tested were that the system will: simplify the check-in and checkout process for staff, increase efficiency, and minimize repetitive motion; provide a self-check component; be able to secure magnetic media such as videos and cassettes, and handle the discharge function in a manner similar to books; provide a link between security and the bibliographic record; provide an efficient way to inventory the library's collection without having to physically handle each item; provide a flexible system that could be used easily with new and future technology, such as an automated materials handling system; and, by combining RFID technology at the circulation desk, self-check machines, and eventually an automated materials sorting system, free circulation staff to perform direct patron information services.

    This poster session will inform attendees of our survey findings; workflows; test results, particularly using Libramation & Innovative Interfaces; pre- and post-implementation costs; the effect of RFID on Library as a Place; and our conclusions.

    Navjit Brar received her BA in Sociology & Psychology and MA in Sociology from Panjab University, and her MLIS from San Jose State University. She began her career as a Library Assistant at USC and CSU, Fullerton, and held professional positions at Michigan State and in New Jersey. She currently works at the Cal Poly Kennedy Library as an Assistant Dean, ABS.





    Monday, September 25, 6:00-7:30

    Poster: Preston

    Statistics & Assessment: The Positive Effects at the Harold B. Lee Library of Brigham Young University

    Julene Butler and Brian Roberts

    Acting on their assumption that assessment is central to successful library service, administrators at Brigham Young University's Harold B. Lee Library began to establish a culture of assessment as part of their long-term strategic plan. Commencing in early 2001, when the Library repurposed a position to hire an assessment officer, the Library enhanced its assessment activities through participation in a variety of national and international studies. Four times between 2001 and 2006 the Library participated in LibQUAL+, twice with other libraries in the Consortium of Church Libraries & Archives. During the spring of 2005, the Library evaluated reference services through involvement in WOREP and also participated in SAILS to assess the effectiveness of its information literacy programs. Several in-house studies have been conducted, including an assessment of the role of subject specialists (2001), improvement of specific internal workflows and processes (2001), uniform collection of reference statistics (2003), usability of the library web site (2005), and analysis of future collection space needs (2005 & 2006). Findings from each of these studies have resulted in improvements to facilities and services, including establishment of an expedited acquisition system for urgently needed materials, creation of an Information Commons, and allocation of funding for adding journals to library collections.

    This paper describes the major assessment studies conducted by the Harold B. Lee Library since 2001, explains specific changes that have resulted from those studies, and discusses the impact assessment activities have had on library resources and organizational structure.

    Julene Butler: Associate University Librarian, Lee Library, BYU. Ph.D., Communication, Information, Library Studies, Rutgers University, 1996. MLS, BYU, 1971.

    Brian Roberts: Process Improvement Specialist, Lee Library, BYU. MS, Statistics, BYU, 1983. BS, Business Statistics, BYU, 1980.





    Monday, September 25, 6:00-7:30

    Poster: Preston

    Improving Annual Data Collecting: An Interactive Poster Session

    Linda Miller

    In 2005, IRIS Research & Assessment Services (RAS) took on responsibility for Cornell University Library's (CUL's) annual data collection. With minimal resources, RAS worked to make the 2004/05 data collection easier and more transparent for data providers and library managers. It developed a table of the measures included in the most recent annual statistical report and recurring national surveys (ordered by functional area) to help familiarize staff with all measures and ensure that all core data was collected at one time; developed an expanded definitions file to promote consistency and to support data coordinators; created Excel files to facilitate data input and management, and to allow for percentage change comparisons with 2003/2004 data; encouraged and made it easier to include more notes; collected a large part of the centrally-provided data before the call to individual units so they had more time to review figures provided for their libraries; expanded instructions and provided training sessions; made it more explicit to whom the data was being reported; and involved the reporting units in data verification and analysis.

    In 2004/05, RAS also started to think about how to update tables to mainstream e-resource statistics and make the presentation of data more useful to a wider variety of audiences. Finally, RAS requested feedback from library managers in various forums. In 2006, RAS is building upon this earlier work. To ensure that current and future data collection efforts are as meaningful as possible, RAS asked each functional area's executive committee to take "ownership" of tables representing their areas, including setting and defining measures to be collected, and assisting in data review and analysis. We envision that this ongoing, cyclical process, involving staff throughout the library, will allow us to create a solid (and more easily gathered and shared) set of repurposable data to support a full assessment program, one that will incorporate both quantitative and qualitative metrics into future strategic planning efforts. In this poster session, Linda Miller will outline the CUL annual data collection and related processes and welcomes discussion with other conference attendees of their data-gathering efforts. She will share insights gained on ARL-ASSESS.

    Linda joined Cornell's IRIS Research & Assessment Services in 2003. RAS's responsibilities include: coordinating/supporting the annual data collection and the completion of external central surveys; and taking on, as assigned, ad hoc data collection, manipulation, and presentation projects, and quick environmental scans/surveys of peers on library services and organizational issues.





    Monday, September 25, 6:00-7:30

    Poster: Preston

    A Time-Budget Study of the George Mason University Libraries' Liaison Program

    James E. Nalen

    Time-budget studies of programs and services can form a useful part of a library's assessment toolkit. Analysis of time-budgets may reveal how staff are responding to changes in the institutional environment through their allocation of time among competing sets of necessary activities. In fiscal year 2006, the George Mason University Libraries employed a time-budget study to analyze the workload of librarians within the Libraries' liaison program. The liaison program is situated in an environment of rapid growth in graduate education programs and sponsored research. Twenty librarians were required to report activities occupying each half hour of each day during five weeks that had been selected through systematic sampling. Activities were coded using a category system that had been deductively constructed, with considerable input from the librarians themselves.

    The time-budget instrument consisted of an Excel worksheet. Excel functions within the worksheet automatically calculated the percentage of time spent on any given activity. Frequency tables, histograms and other descriptive statistics were generated from the aggregated data. The data lent themselves to comparisons between individuals and between sub-groups (e.g. liaisons at a particular library site) and the population as a whole. These statistics and comparisons helped to provide a better understanding of the complex nature of liaison work, while also challenging some assumptions about the Libraries' liaison program. The time-budget methodology was found to be constrained by the seasonal nature of liaison librarians' work, as well as by a certain level of demand characteristic bias. While the time-budget survey revealed differences in how librarians allocate time to different aspects of the liaison program, the methodology did not help the Libraries to fashion an equitable redistribution of workload among librarians.
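    The core tabulation the worksheet performed can be sketched as follows. The activity codes and entries here are invented for illustration; the study used its own deductively constructed category system.

```python
from collections import Counter

# Invented half-hour activity codes reported by one librarian over a
# sampled day (eight half-hour slots shown for brevity)
entries = ["reference", "instruction", "collection", "reference",
           "liaison_outreach", "reference", "instruction", "email"]

counts = Counter(entries)
total = len(entries)

# Percentage of reported time spent on each activity, the figure the
# Excel worksheet calculated automatically for each respondent
percent = {activity: 100 * n / total for activity, n in counts.items()}
```

    Aggregating these per-person percentages across the twenty librarians then supports the frequency tables, histograms, and sub-group comparisons the abstract describes.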

    James E. Nalen is the Planning, Assessment & Organizational Development Coordinator at the George Mason University Libraries. He received his MSPA in 1999 from the University of Massachusetts Boston and his MSLIS in 1996 from Simmons College.





    Monday, September 25, 6:00-7:30

    Poster: Preston

    Methodological Diversity and Assessment Sustainability: Growing the Culture of Assessment at the University of Washington Libraries

    Maureen Nolan, Jennifer Ward, and Stephanie Wright

    The University of Washington Libraries has been active in library assessment since 1991 and is frequently mentioned as one of the few academic libraries with a thriving culture of assessment. During the past fifteen years the assessment program has grown steadily, moving from a one-time large-scale survey to an ongoing distributed program that utilizes a variety of methodological tools. Organizationally, assessment efforts have moved from an ad hoc survey committee to a broadly representative assessment committee, and recently to a central assessment and planning office with 1.5 FTE librarians and assessment efforts conducted throughout the Libraries. The assessment focus has broadened from user needs and satisfaction to evaluation of library services and resources and the value they add to the entire University community.

    This poster highlights different methods, such as surveys, usability and data mining, that are used to gain input and evaluate services. We also show how specific assessment information has been used to improve services and add value for our customers.

    All three presenters have long been active in UW Libraries assessment and are members of the UW Libraries Assessment Group.

    Maureen Nolan, Natural Sciences & Resources / Friday Harbor Librarian, UW Libraries;

    Jennifer Ward, Head, Web Services, UW Libraries; and

    Stephanie Wright, Management Information Librarian, UW Libraries.





    Monday, September 25, 6:00-7:30

    Poster: Preston

    Collecting the *Right* Data for Decision-Making

    Kimberly Burke Sweetman and Marybeth McCartin

    User comments indicated dissatisfaction with circulation desk hours. By examining circulation data specifically (by day of the week and hour of the day) rather than in its aggregate form, we were able to identify some simple hours adjustments that would better meet user needs.
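    The day-by-hour breakdown described above can be sketched as a simple tally. The timestamps below are invented for illustration; a real analysis would work from a transaction export out of the library system.

```python
from collections import defaultdict
from datetime import datetime

# Invented checkout timestamps (a real export would come from the ILS)
checkouts = [
    "2006-02-06 08:40", "2006-02-06 23:15", "2006-02-07 23:45",
    "2006-02-10 08:10", "2006-02-13 23:30",
]

# Tally transactions by (weekday, hour) instead of one aggregate total
by_slot = defaultdict(int)
for ts in checkouts:
    dt = datetime.strptime(ts, "%Y-%m-%d %H:%M")
    by_slot[(dt.strftime("%A"), dt.hour)] += 1

# A cluster of activity in the 11 p.m. hour, invisible in an aggregate
# count, would argue for keeping the desk open later
late_night = sum(n for (day, hour), n in by_slot.items() if hour == 23)
```

    The point of the disaggregation is exactly this: the same total volume can hide slots where demand and desk hours are mismatched.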

    Kimberly Burke Sweetman is Head of Access Services at New York University and the author of Managing Student Assistants, published this year in Neal-Schuman's popular How-To-Do-It series.

    Marybeth McCartin is Head of Instructional and Undergraduate Services at New York University. Ms. McCartin is winner of the 2005 Merlot Information Technology (Classics) award.





    Monday, September 25, 6:00-7:30

    Poster: Preston

    Creating On-going, Integrated Assessment Efforts in Community College Libraries

    Mark Thompson

    What will be covered: how to scale assessment efforts at a two-year college so they are successful; how to identify and decide on issues to explore; using assessment so it leads to results; and success factors uncovered in four recent assessment projects.

    A concerted effort was made to infuse assessment into many aspects of managing the sole library at a large (15,000 students) community college in northern New Jersey. Given the busy library, with high demand at all service desks, there had been limited staff time to spend on large-scale, formal assessment efforts. Fortunately, however, in the fall of 2004, cost-effective participation in the national LibQUAL+ survey became available through the local consortium. This allowed the library, for the first time, to benchmark key issues related to services, resources and staffing. The insights from this study were then used to create an overall assessment plan.

    Priorities were established and then ad-hoc, short-term efforts were applied to various aspects of library services. Locally designed and executed measurement tools were used to gather input on four specific and important issues: noise levels in library areas; library website ease-of-use; usage levels of specific electronic resources; and which topics to cover in library instruction sessions.

    The approach taken was a practical one. The assistant director planned the efforts and ran the research internally in consultation with the institutional research director and the librarians. Each effort was designed to be of a reasonable scale and short-term. Each effort's cycle (plan, research, findings, ideas, and changes) took place in a few months, allowing for results to be realized near-term. This engendered a positive feedback loop and increased participation in the efforts.

    Customer satisfaction and usage levels were measured on each of these topics. The targeted efforts netted results for further assessment and, where necessary, changes were made in procedures and approaches. The participation of the staff in using and applying assessment data resulted in significant new levels of understanding of the issues, and in some cases, changes were made in providing library service. Assessment continues in these areas.

    Factors in successful community college library assessment: start with an overall view, then set priorities; select targeted and scalable efforts; benchmark (scope out the problems and solutions found at other libraries); create short feedback loops and then take action; and involve key staff in their areas.




    Mark S. Thompson has 25 years of experience as a librarian spanning many arenas: corporate (Bell System; Dow Jones & Co.), academic (Fairleigh Dickinson Univ.), and public. He also founded his own information broker firm (Knowledge Resources). He is currently Assistant Director of the Library at Bergen Community College in Paramus, NJ.





    Monday, September 25, 6:00-7:30

    Poster: Preston

    Bribes Can Work: Ensuring Your Assessment Tool Is Not Ignored

    Luke Vilelle

    You've got the questions. They've got the answers. It seems so simple. Yet no matter how finely tuned your survey is, how penetrating your focus group questions are, or how enlightening your usability study might be, you still have to get people to participate. Breaking through today's cluttered world to reach your target population can prove difficult.

    This practical, experience-based poster session will focus on the art of drawing students into your web of assessment.

    The presenter, who leads the marketing team at the University Libraries at Virginia Tech, will share his discoveries from marketing multiple surveys and usability studies over the past year and a half. Assessment tools to be highlighted include an online iPod contest that drew over 1,100 entrants into the new library catalog, a services survey conducted in the library lobby, and a usability study.

    Based on the complexity of the assessment tool and time required for completion, consider whether a giveaway is needed to entice students to participate. What giveaways attract the most attention? Is it better to offer a guaranteed cheap giveaway to everybody, or offer only a chance to win a more valuable prize? The poster session will discuss the effectiveness of various prizes, from a free soda to a chance to win an iPod, that have been used at Virginia Tech.

    Picking the lure is only the first step. The presenter will also discuss methods of marketing the assessment tool and any associated giveaway. If nobody knows you are awarding iPods, then your carefully targeted giveaway will be for naught. Sandwich boards, web site publicity, and old-fashioned personal interactions are a few of the methods to be discussed through this poster session.

    The poster will display its points through graphics and pictures whenever possible, and will use as large a text size as possible. The poster will be accompanied by a handout that summarizes its key points and includes a short bibliography.

    The presenter hopes attendees will leave with concrete ideas that can be used to increase participation rates in their next assessment effort.

    Luke Vilelle has been an Outreach Librarian at Virginia Tech since August 2004. He has presented poster sessions on marketing (at ACRL 2005) and on assessing virtual reference (at an ALA 2005 preconference), and is part of a panel session on assessment at the Virginia ACRL chapter's 2006 spring program.





    Monday, September 25, 6:00-7:30

    Poster Preston

    Assessing Library Instruction with Experimental Designs

    Scott White and Remi Castonguay

    The amount of information now available to students has created greater expectations on the part of faculty concerning student use of information resources. However, it is becoming evident that students are being overwhelmed by assignments, the amount of information available, and the increasing number of sources in which to look.

    At LaGuardia, we offer one-shot library instruction sessions, credit-bearing courses, and one-on-one consultations to teach students how to conduct meaningful research and prepare well-constructed assignments. We are now planning and conducting research to see if the instruction programs are affecting students in a variety of ways.

    Our assessment will focus on two modes of instruction: a three-credit course offered in a cluster with an ESL course and an introductory social science course; and one-hour instruction sessions, mandated as part of an English research class and voluntary for any faculty who wish to bring a class to the library for a session. In Fall 2005, we conducted over 150 one-hour classes.

    A student usability study currently under way at LaGuardia Community College's Library Media Resources Center is measuring how students navigate the Library's website for information. This qualitative study, using questionnaires and in-depth student interviews, will present a picture of how students use the Library and its virtual resources. It is becoming clear that students who attend Library Instruction sessions perform differently on the tests than those who don't. Other research designs are currently being developed to help us answer why.

    The Library's instruction cluster will also be evaluated using a quasi-experimental design. At this point, 50 students have been taught in the cluster over the last two years. We are planning to match those students with students who did not take the cluster to see if there are differences in student performance, retention, transfer, and graduation rates. Faculty believe that the course helps students to perform better in classes where research is required, and gives them a better understanding of how the Library can help them successfully complete assignments in support of their class work. Anecdotal evidence suggests that the credit course, called LRC 102, Information Strategies, has a positive effect on students in terms of performance, retention, transfer, and graduation. Using student data and conducting interviews with students who took the class and those in the control group, we will test the hypothesis that students who complete the LRC course will perform better in the above categories. Using various statistical analyses, we will control for demographics, national origins, and student backgrounds.

    Scott White is Head/Access Services and Systems Librarian at LaGuardia Community College (LAGCC), CUNY. He is also an adjunct lecturer at John Jay College. He teaches the Library's three-credit class in a three-course cluster at LAGCC.

    Remi Castonguay is the Coordinator of Instructional Resources Development at LAGCC, developing digital instructional materials for use in library instruction. He has completed a full web usability study of the LaGuardia Library website and is currently evaluating the collected data.




    Monday, September 25, 6:00-7:30

    Poster Preston

    Developing Personas for Evaluating Library Service Needs

    Dan Wilson

    This poster will explain a project being undertaken this year by the Claude Moore Health Sciences Library in conjunction with UVa's Public Health Sciences (PHS) department. The goals of the project are to develop archetypal user "personas" based on primary user group members to assist in the prioritization of changes to library services, and to evaluate existing library services to determine the extent to which these services are meeting the needs of users. Working collaboratively, PHS will provide the structure and methodology of the project, while Library faculty will assist with content.

    For our purposes, personas are defined as archetypal users whose characteristics are derived from quantitative and qualitative data gained primarily from interviews and surveys. Each persona will represent a library user group whose members share similar information needs.

    The framework of the project includes a web survey and interviews with representatives of our major user groups. The web survey was approved by the University of Virginia's Institutional Review Board for Health Sciences Research (IRB). The web survey was released to targeted user groups in the middle of April and ran until the first of May. Interviewing is currently taking place and should be finished by the end of September. Each interview is being recorded, with the interviewee compensated for his/her time with a free lunch at the University Hospital cafeteria. Following each interview, PHS staff are reviewing, coding, and analyzing content. In the Fall, PHS staff will begin developing personas based on the interviews and the results of the survey. A draft of the personas will then be presented to Library faculty for their review.

    The Library plans to use the personas to evaluate the Library's web page as well as to assess open-enrollment courses offered in the Library. In particular, we are hoping to ascertain the effectiveness of our current line-up of courses and delivery methods. Once the personas are established, they will be used to assess and evaluate other user services.

    Dan Wilson has been the Assistant Director for Collection Management & Access Services at the University of Virginia Health Sciences Library since 1991. He manages the LibQUAL+ survey at the library and is active in developing local tools for assessing library services.





    Monday, September 25, 6:00-7:30

    Poster Preston

    Perceiving Perception: A Case Study of Undergraduates' Perception of Academic Libraries

    Steven Yates

    As part of work funded by an IMLS grant to educate and train 21st-century academic librarians, the author developed and administered an online survey to undergraduate students completing requirements in a medium-sized university's Honors College. The survey aimed to gauge the students' perceptions of the university's current library resources.

    They were also asked to project what they would like to see from the library system. These results were then compared to the OCLC report titled "Perceptions of Libraries and Information Resources" to determine how a case study example of perception in a small student population reflects the trends found in the report.

    Steven Yates is an IMLS fellow with the University of Alabama Libraries and the University of Alabama School of Library and Information Studies. Elizabeth Aversa is Director and Professor at the School of Library and Information Studies at the University of Alabama.





    Tuesday, September 26, 9:00-10:30

    Parallel 3 #2: Moving Assessment Forward

    Salon B

    Evidence-Based Library Management: A View to the Future

    Amos Lakos

    In today's rapidly changing information and financial arenas, libraries must continually demonstrate that their services have relevance, value, and impact on institutional stakeholders and customers. If libraries are to succeed in this new environment, decisions and decision making must be based on the effective use of data and management information.

    This paper is an extension of my earlier work on developing management information services in libraries and on the culture of assessment. I will focus my observations on the new opportunities for data analysis, assessment delivery, and decision making in libraries. My work is informed by an earlier study done by Susan Beck (Rutgers) and the recent and ongoing assessment work carried out by Steve Hiller (University of Washington) and James Self (University of Virginia).

    In an earlier paper, "Creating a Culture of Assessment: A Catalyst for Organizational Change," Portal 4:3 (July 2004), Shelley Phipps (University of Arizona) and I discussed the need for libraries to build a "culture of assessment" into their larger organizational cultures. We described a new paradigm that encompasses organizational culture change, utilizing the learning organization and a systems thinking approach. We made the case for transforming library organizational cultures and librarians' professional culture in such a way as to focus on achieving quality and measurable outcomes for library customers and institutional stakeholders. We then defined the components of a culture of assessment and the elements required to implement it in libraries. Additionally, we identified the need for a clearly articulated purpose and strong leadership, the internalization of systems thinking, organizational openness and trust, ongoing open communication, and an actively encouraged climate of risk taking.

    In short, we promulgated a future-oriented, transformative culture that values and uses assessment to succeed. We know that cultural change in organizations develops slowly and is a learned process. We also acknowledge the need for organizational learning in order to strengthen the foundations supporting transformational change.

    However, the information environment continues to evolve and change rapidly, at a pace that libraries have difficulty anticipating and responding to. My presentation will examine how developments in the Information Technology (IT) area — especially the increased dominance of very large networked infrastructures and associated services, large-scale digitization projects, collaborative frameworks, and economic and market trends — impact and will continue to impact our library environment, and how those developments introduce a variety of opportunities to move libraries and librarians to an evidence-based framework. Libraries will need to seize these opportunities or face the likelihood of becoming relic or legacy organizations.

    In addition to drawing upon my earlier work and that of others, I will be conducting a series of direct discussions with researchers and library leaders, focusing on leaders' decision-making activities and the degree to which they use data in their decision making. My conclusions will be future-oriented and possibly speculative. They will describe a number of possible scenarios for incorporating data and analysis services to support the need to make evidence-based librarianship the norm.




    Professor Louise Fluk has been Coordinator of Instruction at LaGuardia Community College for the past eight years. She was instrumental in developing the three-credit course, Information Strategies, which introduces students to the research process. She chairs the Information Literacy Assessment Rubric Committee and is currently supervising the norming of the College's Information Literacy and Research Assessment Rubric.





    Tuesday, September 26, 11:00-12:00

    Parallel 4 #1: Information Literacy II

    Salon A

    The Right Assessment Tool for the Job: Seeking a Match Between Method and Need

    Megan Oakleaf

    In the twenty-first century, all institutions of higher education face calls for accountability. Until recently, the demands faced by other academic units on campus have passed over college and university libraries. Now, academic librarians are increasingly pressured to prove that the resources and services they provide result in improvement in student learning and development. In answer to calls for accountability, academic librarians must demonstrate their contribution to the teaching and learning outcomes of their institutions. One way librarians can achieve this goal is by assessing information literacy instruction on campus.

    When assessing information literacy skills, academic librarians have a variety of outcomes-based assessment tools from which to choose. The selection of a tool should be based on the purposes of the assessment situation and on the fit between its needs and the strengths and weaknesses of individual assessment approaches. In this paper presentation, participants will learn about the major purposes of assessment and how they shape the criteria used to select an assessment tool. Among the purposes of assessment that will be discussed are the needs to respond to calls for accountability, to participate in accreditation processes, to make program improvements, and to enrich the student learning experience. Participants will also learn about several criteria useful in selecting an assessment tool, including utility, relevance, stakeholder needs, measurability, and cost.

    The presentation will also describe the benefits and limitations of several outcomes-based approaches to the assessment of student learning outcomes. Specifically, participants will learn about assessment tools that evaluate student learning outcomes taught via information literacy instruction. The theoretical support, benefits, limitations, and relevant research related to the use of surveys, tests, performance assessments, and rubrics will be outlined, and examples will be given of each assessment tool.

    Armed with background knowledge and lessons learned from colleagues throughout higher education, academic librarians can embrace the effective and efficient assessment of information literacy instruction efforts on their campuses. This presentation will give participants a jump start toward becoming proficient assessors ready to answer calls for accountability.

    Megan Oakleaf is an Assistant Professor in the School of Information Studies at Syracuse University. Prior to this position, Oakleaf served as Librarian for Instruction and Undergraduate Research at North Carolina State University. In this role, she trained fellow librarians in instructional theory and methods. She also provided library instruction for the First-Year Writing Program, First-Year College, and Department of Communication. Oakleaf completed her dissertation, "Assessing Information Literacy Skills: A Rubric Approach," at the School of Information and Library Science at the University of North Carolina at Chapel Hill. Prior to a career in librarianship, Oakleaf taught advanced composition in public secondary schools. Her research interests focus on outcomes-based assessment, user education, information services, and digital librarianship.





    Tuesday, September 26, 11:00-12:00

    Parallel 4 #2: Evaluation and Assessment Methods

    Salon B

    Choosing the Best Tools for Evaluating Your Library

    Neal Kaske

    We have numerous library assessment tools available to us today. Most of them have proven validity and high reliability, but that alone does not ensure that we have selected the best tool or tools to build our assessment case given our unique circumstances. This paper offers a series of questions which, when answered, provide direction as to the form of assessment and the appropriate measurement tools to employ. The questions are: 1) Why are we measuring? 2) Who will use the results? 3) Do we have baseline data, or is this effort to establish a benchmark? 4) What will this tool or tools tell us, and what is the precision of its measurement? 5) What new key information will we have from this effort? 6) What are the initial and continuing costs of using this tool? 7) What are the staffing requirements, and what does the staff take away from the effort? 8) Will the assessment resonate with and help support the goals of the library's parent organization? 9) How will the findings be utilized by the library's parent organization? 10) How might the findings from the assessment be used against the library? Methods for answering these questions are provided, accompanied by graphic illustrations of the different paths one can take in choosing the best library assessment tool or tools for your given circumstances.

    Neal Kaske, Director of Statistics and Surveys, US National Commission on Libraries and Information Science, has been an active researcher in library evaluation for many years. His experience includes academic library administration, teaching, research, research management, and grant management. Neal's doctorate is in industrial engineering (library systems management); he holds a master's in librarianship and a baccalaureate in sociology.





    Tuesday, September 26, 11:00-12:00

    Parallel 4 #2: Evaluation and Assessment Methods