
Faculty Perceptions of ACRL’s Information Literacy Competency Standards for Higher Education

by Shelley Gullikson

Available online 31 July 2006

Faculty were asked how important for their students the Association of College and Research Libraries’ Information Literacy Competency Standards’ outcomes are, and when students should display the relevant skills. Faculty believe most of the Standards’ outcomes are important but show little agreement on when students should acquire them.

Shelley Gullikson, Information Literacy Coordinator, Mount Allison University Libraries, 49 York Street, Sackville, NB, Canada E4L [email protected].

The Journal of Academic Librarianship, Volume 32, Number 6, pages 583–592

INTRODUCTION

ACRL’s Information Literacy Competency Standards for Higher Education (hereafter referred to as the Standards) were published in 2000 and have had wide acceptance by librarians in colleges and universities in the United States and Canada and beyond. Many librarians base their information literacy (IL) instruction programs and assessment instruments on the Standards.1 The Project for the Standardized Assessment of Information Literacy Skills (Project SAILS) and the Information and Communication Technology (ICT) Literacy Assessment from Educational Testing Service (ETS) are two high-profile assessment instruments developed for large-scale assessment of IL that have drawn on the Standards. Project SAILS has mapped its ‘‘skill sets’’ directly from the Standards’ outcomes and objectives,2 whereas the ETS ICT Literacy Assessment was developed using the Standards but does not set out which specific outcomes are covered.3 Workshops offered by ACRL on assessment of IL also use the Standards as their base.4 The Standards are quite firmly entrenched in information literacy instruction and assessment in our colleges and universities.

A common theme in the years before and since the publication of the Standards has been integrating information literacy into the curriculum. In some cases, ‘‘the curriculum’’ refers to the overarching university curriculum, and IL is integrated in a number of ways—through stand-alone IL courses, through integration in core courses like first year English, or through participation in First Year Experience programs.5 In some cases, ‘‘the curriculum’’ simply refers to the curriculum of a single course, and IL instruction is integrated into that one course.6 In other cases, ‘‘the curriculum’’ refers to an academic department’s curriculum, so that curriculum-integrated IL encompasses several courses throughout a degree program.7 This last sense of curriculum integration seems to be what is referred to in ACRL’s Characteristics of Programs of Information Literacy that Illustrate Best Practices: A Guideline, which includes an entire category called ‘‘Articulation within the Curriculum,’’8 and this is how curriculum-integrated IL will be interpreted here.

Instruction librarians are encouraged to work with faculty to integrate IL into the curriculum and we are also encouraged to use the Standards in developing and assessing IL instruction.


Taken together, this means we are to work with faculty to integrate our standardized skills into their curriculum. In practice, librarians and faculty will often work together to decide what should be covered and then the librarian maps what has been decided onto the Standards document.9

However, as librarians move to use the Standards for assessment of IL instruction (through Project SAILS, the ETS ICT test, or a homegrown model coming out of an ACRL workshop), the Standards will have to be incorporated at the beginning of the process in order to ensure students are taught what they are evaluated on. Faculty are therefore more likely to be working directly with the Standards in these collaborative processes.

‘‘This research seeks to find out what teaching faculty think about the Standards; how important are each of those 87 outcomes for their students?’’

The library instruction literature has not yet examined in any detail faculty perceptions about the Standards. This research seeks to find out what teaching faculty think about the Standards; how important are each of those eighty-seven outcomes for their students? At what academic level do faculty expect their students to display the skills in these outcomes? Integration of IL in the curriculum tends to assume that IL will be taught over the course of a disciplinary program, building the skills over time. Knowing at what point faculty believe their students require those skills is vital for that process.

LITERATURE REVIEW

There have been several surveys of faculty on the topic of library instruction reported in the literature. Anita Cannon looked at what factors led to faculty taking advantage of library instruction.10 Gloria J. Leckie and Anne Fullerton looked specifically at faculty of Engineering and their attitudes about library instruction.11

Annmarie B. Singh recently reported the results of a faculty survey that did include questions about the Standards.12

Faculty in accredited journalism programs were asked if they judged their students to be information literate, given the Standards’ definition. Singh does not go into detail about what part of the Standards faculty responded to, and the results are not broken down by specific standards or outcomes.

In one study where faculty attitudes about specific IL skills were examined, Jacqui Weetman asked faculty, in part, if they thought seven specific IL skills were important.13 The research was conducted in the UK and used the headline skills from the SCONUL Seven Pillars model14 rather than the Standards. The headline skills under study would be parallel to the five broad standards; they are not broken down into more specific skills, as in the case of the Standards’ outcomes. The large majority of faculty in Weetman’s study believed all seven skills to be important for their students.

Even when looking beyond faculty surveys, there is relatively little in the literature about faculty and the Standards. What is there generally discusses librarian–faculty collaborations where IL instruction is developed or assessed using the Standards, but few articles go into detail about how or if the faculty members actually worked with the Standards themselves. Cecelia Brown and Lee R. Krumholz, a librarian and a faculty member, collaborated on integrating IL into a microbiology course and used the Standards to assess students’ IL skills.15 The article does not specify which outcomes were used for assessment, only that ‘‘The librarian assessed their ability to locate, evaluate, and effectively use the information in the papers using checklists based on the ACRL standards.’’16 The faculty member assessed students’ ‘‘literacy events’’, but it does not appear that the Standards were used in this assessment.

Madeline Ford and Clay Williams discuss a collaboration between a faculty member and a librarian and mention many specific outcomes from the Standards.17 However, the outcomes have been correlated with an existing instruction document, so it does not appear that the faculty member was involved in actually working with the Standards.

Molly R. Flaspohler reports on a grant project with five faculty members to integrate and assess information literacy skills into a first year sequence of courses.18 Faculty were engaged in a ‘‘lengthy discussion of the ACRL Information Literacy Competencies for Higher Education’’19 but the Standards are not specifically mentioned again in the article. The topics taught in the revamped instruction sessions are described (using the library catalogue, evaluating and identifying types of periodicals, using a periodical index), but not in terms of the Standards or its outcomes.

Lori E. Buchanan, L. Luck DeAnne, and Ted C. Jones describe the collaboration of a communications professor with two librarians to integrate the Standards into a communications course.20 Buchanan et al. even get down to the level of the more specific objectives devised by the Instruction Section of ACRL. They report that the professor was very interested in students learning how to find and evaluate Web content. However, it is not clear whether the communications professor helped to select the specific IL outcomes and objectives to be covered in the course.

In Ann M. Feigen, Bennett Cherry, and Kathleen Watson’s article describing a project where faculty did work with the Standards, faculty were only involved in mapping outcomes and objectives from the Standards to their courses’ existing learning outcomes.21 Although faculty read the Standards and applied them in this case, there is no mention of faculty reaction to the Standards themselves.

There is little in the existing literature to give librarians guidance on which parts of the Standards faculty are most interested in incorporating into their curricula. Because librarians are encouraged to collaborate with faculty to integrate IL into the curriculum, some sense of what faculty think about the IL outcomes would be useful. By surveying faculty on the specific outcomes in the Standards, this study seeks to come up with a starting point for discussions with faculty on which of the Standards’ outcomes are most important to cover, and at what academic level these outcomes are expected of students.

METHODS

The study was carried out in two phases. Phase one involved a survey of faculty at Mount Allison University, a primarily undergraduate institution in Sackville, New Brunswick. One year later, phase two expanded to survey faculty members at other institutions in the region. E-mail messages were sent to heads of library instruction at seven primarily undergraduate universities requesting assistance with the dissemination and collection of the survey. Librarians from four institutions – Cape Breton University in Sydney, Nova Scotia; Mount Saint Vincent University in Halifax, Nova Scotia; Saint Francis Xavier University in Antigonish, Nova Scotia; and the University of Prince Edward Island in Charlottetown, Prince Edward Island – agreed to assist with the study.


Table 1. Categorization of Departments into Schools

Schools          Departments
Arts             Communication; Culture, Heritage, and Leisure Studies; English; History
Science          Biology; Chemistry; Engineering/Math/Computer Science; Family and Nutritional Sciences; Physics; Psychology
Social Science   Economics; Anthropology/Sociology; Geography; Political Science; Problem Centered Studies
Professional     Business; Education; Financial/Information Management

Table 2. Outcomes of Highest Average Importance—Overall

Rank  Mean    N   Label  Outcome
1     3.9545  66  5.2f   Demonstrates an understanding of what constitutes plagiarism and does not represent work attributable to others as his/her own
2     3.8642  81  3.1a   Reads the text and selects main ideas
3     3.8375  80  3.1b   Restates textual concepts in his/her own words and selects data accurately
4     3.759   83  3.4c   Draws conclusions based upon information gathered
5     3.7531  81  1.1c   Explores general information sources to increase familiarity with the topic
6     3.7424  66  4.3d   Communicates clearly and with a style that supports the purposes of the intended audience
7     3.7342  79  1.1e   Identifies key concepts and terms that describe the information need
8     3.7317  82  3.2c   Recognizes prejudice, deception, or manipulation
9     3.7273  66  5.3a   Selects an appropriate documentation style and uses it consistently to cite sources
10    3.7077  65  2.5d   Records all pertinent citation information for future reference


In February 2004, surveys were sent through campus mail to all 153 full- and part-time faculty at Mount Allison University who were teaching in the Winter term. One letter of reminder was sent to the same faculty members two weeks later. The survey instrument listed the eighty-seven outcomes from the five standards. For each outcome, respondents were asked how important they believed it to be for their students to have that skill. A four-point scale of ‘‘not important,’’ ‘‘somewhat important,’’ ‘‘important,’’ and ‘‘very important’’ was used, with an additional option of ‘‘don’t know.’’ Respondents were also asked at what academic level they expected their students to have that skill. They could select first year of university, second year, third year, fourth year, later, in high school, or never. Five demographic questions were asked regarding gender, number of years taught, full-time or part-time status, and in which School (e.g., Arts, Science) and department they taught.

Thirty-two surveys were returned, for a response rate of 21 percent. Informal feedback indicated that the surveys were found to be very lengthy. It was decided that in the second phase, the survey would be split in half in the hopes of improving the response rate. The surveys were split as follows: the forty-two outcomes from Standards 1 and 3 comprised Survey 1, and the forty-five outcomes from Standards 2, 4, and 5 comprised Survey 2. Dividing the surveys in this way kept the outcomes grouped by standard and made the two surveys of comparable length. The eight-page, 179-question survey became a five-page survey of eighty-nine questions for Survey 1 and ninety-five questions for Survey 2. There was also informal feedback on the timing of the survey; some faculty liked that the timing coincided with the week-long term break, others indicated they would have preferred receiving the survey at the end of term.

In Spring 2005, the modified surveys were sent out. To address the timing concerns brought up in phase one, an attempt was made to send half of the surveys to coincide with each university’s term break and the remainder at the end of term. In two instances this was not possible; in one case a long delay in receiving approval from the institution’s Research Ethics Board meant all surveys were sent at the end of term, and in the other case the end-of-term surveys would have conflicted with that library’s plans for doing their own survey of faculty so only half of the intended number were disseminated.

Pre-addressed surveys were mailed to the instruction librarians who had agreed to assist with the study, and they sent the surveys to faculty through campus mail. The sample was drawn from faculty listed in the institutions’ campus directory where possible, or else from the academic calendar’s faculty listing. In order to be comparable to the Mount Allison survey size, sample size for each institution was set at 160. Random sampling was used where the list of faculty numbered greater than 160, otherwise surveys were sent to all faculty. In the end, 146 surveys were sent to faculty at Cape Breton University, only eighty to Mount Saint Vincent (due to an administrative misunderstanding), 160 to St. Francis Xavier, and 160 to the University of Prince Edward Island. The two versions of the survey were distributed randomly but evenly among the samples (i.e., seventy-three of each to Cape Breton University, forty of each to Mount Saint Vincent, etc.). Faculty returned their surveys in a sealed envelope to the contact librarian at their university, and these were then returned to the researcher en masse.


Table 3. Outcomes of Highest Average Importance—Arts

Rank  Mean    N   Label  Outcome
1     4.0000  81  3.1a   Reads the text and selects main ideas
2     3.9615  80  3.4g   Selects information that provides evidence for the topic
3     3.95    66  4.3d   Communicates clearly and with a style that supports the purposes of the intended audience
4     3.95    66  5.2f   Demonstrates an understanding of what constitutes plagiarism and does not represent work attributable to others as his/her own
5     3.9231  82  3.2c   Recognizes prejudice, deception, or manipulation
5     3.9231  83  3.4c   Draws conclusions based upon information gathered
5     3.9231  81  1.1f   Recognizes that existing information can be combined with original thought, experimentation, and/or analysis to produce new information
5     3.9231  82  3.2a   Examines and compares information from various sources in order to evaluate reliability, validity, accuracy, authority, timeliness, and point of view or bias
9     3.8846  79  1.2e   Differentiates between primary and secondary sources, recognizing how their use and importance vary with each discipline
10    3.8800  80  3.1b   Restates textual concepts in his/her own words and selects data accurately

Table 4. Outcomes of Highest Average Importance—Science

Rank  Mean    N   Label  Outcome
1     3.8889  66  5.2f   Demonstrates an understanding of what constitutes plagiarism and does not represent work attributable to others as his/her own
2     3.8846  80  3.1b   Restates textual concepts in his/her own words and selects data accurately
3     3.7778  65  2.5d   Records all pertinent citation information for future reference
4     3.7692  81  3.1a   Reads the text and selects main ideas
5     3.7407  83  1.1a   Confers with instructors and participates in class discussions, peer workgroups, and electronic discussions to identify a research topic, or other information need
6     3.7308  81  1.1c   Explores general information sources to increase familiarity with the topic
7     3.6923  82  3.4e   Determines probable accuracy by questioning the source of the data, the limitations of the information gathering tools or strategies, and the reasonableness of the conclusions
8     3.6296  83  3.4c   Draws conclusions based upon information gathered
8     3.6296  82  3.2a   Examines and compares information from various sources in order to evaluate reliability, validity, accuracy, authority, timeliness, and point of view or bias
8     3.6296  83  3.4f   Integrates new information with previous information or knowledge


Despite the shorter survey, response rates were lower in the second phase. Only eighty-five usable surveys were returned of the 546 sent. After including the thirty-two completed surveys from Mount Allison, there were eighty-three responses to Survey 1 (Standards 1 and 3) and sixty-six responses to Survey 2 (Standards 2, 4, and 5). The overall response rate for the two rounds of surveys stands at 16.7 percent, but because Mount Allison faculty responded to all eighty-seven outcomes, the response rate for Survey 1 was 23.7 percent and 18.9 percent for Survey 2. This is not far off Weetman’s response rate of 21 percent or Singh’s response rate of 22.3 percent, is slightly lower than Leckie and Fullerton’s response rate of 28 percent, and is quite a bit lower than Cannon’s response rate of 41 percent. The results are not broadly generalizable but may be taken as a study of response to the Standards by faculty in small undergraduate institutions. Given that there has been no other comparable survey of faculty opinion, these results are a starting point for discussion and further research.

RESULTS AND DISCUSSION

Demographics

With one exception, there were no significant differences between the populations answering the two surveys. Survey 1 was answered by 55.4 percent male and 44.6 percent female faculty members, whereas Survey 2 was answered by 63.6 percent male and 36.4 percent female faculty members. Canadian universities reported 33.9 percent female faculty in 2003;22 however, the five universities under study were above the national average in percentage of female faculty in 2003/2004,23 so there is only a slight overrepresentation of female faculty in this survey. Overall, over 90 percent of the respondents were full-time faculty, not surprising because part-time faculty are not often listed in academic calendars and university directories from which the sample was drawn. Teaching experience varied, with 21.4 percent teaching 0–5 years, 17.1 percent teaching six to ten years, 18.8 percent teaching eleven to fifteen years, 12.8 percent teaching sixteen to twenty years, and 29.9 percent with more than twenty years’ university teaching experience. Although there are no national statistics on teaching experience per se, these results are comparable to the breakdown of faculty by age24 and by rank25 in 2003/2004. Even though this was not a random sample, the respondents generally conform to national faculty profiles.


Table 6. Outcomes of Lowest Average Importance—Overall

Rank  Mean    N   Label  Outcome
78    2.7778  63  4.1d   Manipulates digital text, images, and data, as needed, transferring them from their original locations and formats to a new context
79    2.7719  57  4.3c   Incorporates principles of design and communication
80    2.7231  65  2.3d   Uses surveys, letters, interviews, and other forms of inquiry to retrieve primary information
81    2.6557  61  5.1b   Identifies and discusses issues related to free vs. fee-based access to information
82    2.6076  79  3.6c   Seeks expert opinion through a variety of mechanisms (e.g., interviews, e-mail, listservs)
83    2.5231  65  4.3b   Uses a range of information technology applications in creating the product or performance
84    2.4737  76  3.6b   Participates in class-sponsored electronic communication forums designed to encourage discourse on the topic (e.g., e-mail, bulletin boards, chat rooms)
85    2.4615  78  1.3b   Considers the feasibility of acquiring a new language or skill (e.g., foreign or discipline-based) in order to gather needed information and to understand its context
86    2.3939  66  4.2a   Maintains a journal or log of activities related to the information seeking, evaluating, and communicating process
87    2.3443  61  5.2a   Participates in electronic discussions following accepted practices (e.g., ‘‘Netiquette’’)

Table 5. Outcomes of Highest Average Importance—Social Science

Rank  Mean    N   Label  Outcome
1     4       66  5.2f   Demonstrates an understanding of what constitutes plagiarism and does not represent work attributable to others as his/her own
2     3.8889  66  4.1a   Organizes the content in a manner that supports the purposes and format of the product or performance (e.g., outlines, drafts, storyboards)
3     3.8824  65  4.1c   Integrates the new and prior information, including quotations and paraphrasings, in a manner that supports the purposes of the product or performance
4     3.8571  82  3.2c   Recognizes prejudice, deception, or manipulation
4     3.8571  83  3.2b   Analyzes the structure and logic of supporting arguments or methods
6     3.8500  81  3.1a   Reads the text and selects main ideas
7     3.8333  64  2.2a   Develops a research plan appropriate to the investigative method
8     3.8095  82  1.1b   Develops a thesis statement and formulates questions based on the information need
8     3.8095  81  1.1c   Explores general information sources to increase familiarity with the topic
8     3.8095  79  1.1d   Defines or modifies the information need to achieve a manageable focus


Respondents were from a wide range of departments. The response rate does not allow meaningful data to be drawn from a departmental breakdown, but respondents can be grouped into those teaching in schools of Arts (31.9 percent), Science (33.6 percent), and Social Science (22.1 percent), with an additional 12.4 percent teaching in professional schools such as business administration and education. Table 1 shows how departments have been categorized.



Outcomes of High Importance

In this section, outcomes are numbered according to the Standards’ system, so that 1.1a refers to Standard 1, Performance Indicator 1, Outcome a. Average importance has been determined by a mean of Likert scale scores, where 1 was ‘‘not important’’ and 4 was ‘‘very important.’’ These averages have been placed in order from highest to lowest in order to determine the rankings referred to in this section. It should be noted that faculty were not asked to rank the outcomes; they rated the outcomes and the average scores were then put in rank order by the researcher.
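The ranking described above is simple arithmetic. As a minimal sketch (using invented ratings, not the study’s data), the following shows how such means and rankings might be computed; treating blank and ‘‘don’t know’’ responses as excluded is an assumption, consistent with the non-response discussion later in this article.

    # Illustrative only: mean importance per outcome on the 1-4 Likert scale,
    # then outcomes ranked from highest to lowest mean, as in Tables 2-6.
    # Ratings are invented; None stands for a blank or "don't know" response.
    from statistics import mean

    ratings_by_outcome = {
        "5.2f": [4, 4, 3, 4, None, 4],
        "3.1a": [4, 3, 4, 4, 4, 3],
        "2.2c": [2, None, 3, 2, 1, 3],
    }

    summary = []
    for label, ratings in ratings_by_outcome.items():
        valid = [r for r in ratings if r is not None]      # drop non-responses
        summary.append((label, mean(valid), len(valid)))   # (label, mean, N)

    for rank, (label, avg, n) in enumerate(
            sorted(summary, key=lambda row: row[1], reverse=True), start=1):
        print(f"{rank:>2}  {avg:.4f}  N={n}  {label}")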

‘‘...faculty have rated most of the IL outcomes quite high in importance. Sixty-one of the 87 outcomes rate higher than 3.25/4.’’

The data are encouraging, as faculty have rated most of the IL outcomes quite high in importance. Sixty-one of the eighty-seven outcomes rate higher than 3.25/4. Thirty-five outcomes have an average importance of 3.5/4 or better. Fifty-five of the outcomes were rated as very important to the fiftieth percentile. Given the high ratings, one might presume that some respondents simply filled out ‘‘very important’’ for every item, but only two surveys show that pattern. The average scores show a near-normal distribution but skewed positive. This is perhaps not surprising, as those who felt IL outcomes to be most important would be more likely to complete the survey. Because this population is likely to mirror the faculty who are requesting IL instruction, the positive skewing of results should not be troublesome.

Table 7. Academic Level Expected for Top Ten Outcomes

Outcome  High School (%)  1st Year (%)  2nd Year (%)  3rd Year (%)  4th Year (%)  Later (%)  Never (%)
5.2f     27.7             64.6          4.6           1.5           –             –          1.5
3.1a     31.3             50.6          13.3          3.6           –             1.2        –
3.1b     18.3             54.9          15.9          8.5           –             2.4        –
3.4c     17.1             36.6          26.8          13.4          3.7           2.4        –
1.1c     26.5             45.8          18.1          7.2           1.2           1.2        –
4.3d     9.1              30.3          27.3          27.3          3.0           –          3.0
1.1e     7.5              47.5          25.0          16.3          2.5           1.3        –
3.2c     4.9              29.3          35.4          18.3          11.0          1.2        –
5.3a     10.6             47.0          27.3          13.6          –             –          1.5
2.5d     1.5              34.8          31.8          15.2          7.6           –          1.5

5.2f: Demonstrates an understanding of what constitutes plagiarism and does not represent work attributable to others as his/her own.
3.1a: Reads the text and selects main ideas.
3.1b: Restates textual concepts in his/her own words and selects data accurately.
3.4c: Draws conclusions based upon information gathered.
1.1c: Explores general information sources to increase familiarity with the topic.
4.3d: Communicates clearly and with a style that supports the purposes of the intended audience.
1.1e: Identifies key concepts and terms that describe the information need.
3.2c: Recognizes prejudice, deception, or manipulation.
5.3a: Selects an appropriate documentation style and uses it consistently to cite sources.
2.5d: Records all pertinent citation information for future reference.



Table 2 summarizes the ten ACRL IL outcomes that faculty rated the highest in terms of importance. Half of these were rated at the highest importance (4) at the twentieth percentile—5.2f, 3.1a, 3.1b, 3.4c, and 4.3d.

There were differences in the highest rated outcomes across schools. Tables 3, 4, and 5 show the top ten outcomes for Arts, Science, and Social Science, respectively. There were not enough responses from faculty in professional schools to warrant a breakdown. It is interesting to note these differences, but few were statistically significant. In most cases, the ten highest rated outcomes were highly rated by all faculty. The most striking difference was for outcome 1.2e (differentiating between primary and secondary sources), which was ranked ninth by Arts faculty, thirty-eighth by Science faculty, and forty-sixth by Social Science faculty. The difference in average importance was highly significant between Arts and Science faculty (p < 0.002). This is not unexpected, as identification of primary sources is not a skill that Science students tend to require. The difference between Arts and Social Science faculty was also statistically significant, but not to as high a degree (p < 0.05); although the rankings are further apart, their average scores were not as far apart (Arts faculty average: 3.8846, Social Science faculty average: 3.4762, Science faculty average: 3.36). Arts and Social Science faculty also rated outcome 3.2c (recognizing prejudice, manipulation, and deception) significantly higher (p < 0.005 and p < 0.05, respectively) than Science faculty, but this outcome still made the top twenty outcomes for Science faculty, so although statistically significant, it is not very significant in a practical sense.
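The article does not report which test produced these p values. Purely as an illustration of how one outcome’s ratings might be compared between two schools, here is a sketch that assumes an independent-samples (Welch’s) t-test and invented ratings; both the test choice and the numbers are assumptions, not the study’s method or data.

    # Hypothetical between-school comparison for a single outcome (e.g., 1.2e).
    # Test choice (Welch's t-test) and ratings are illustrative assumptions only.
    from scipy import stats

    arts_ratings = [4, 4, 3, 4, 4, 4, 3, 4]
    science_ratings = [3, 3, 4, 2, 3, 4, 3, 3]

    t_stat, p_value = stats.ttest_ind(arts_ratings, science_ratings, equal_var=False)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")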



Table 8. Top Ten Librarian-Responsible Outcomes

Rank  Overall Rank  N   Label  Outcome
1     5             81  1.1c   Explores general information sources to increase familiarity with the topic
2     7             79  1.1e   Identifies key concepts and terms that describe the information need
3     8             82  3.2c   Recognizes prejudice, deception, or manipulation
4     9             –   5.3a   Selects an appropriate documentation style and uses it consistently to cite sources
5     11            82  3.2a   Examines and compares information from various sources in order to evaluate reliability, validity, accuracy, authority, timeliness, and point of view or bias
6     14            79  1.1d   Defines or modifies the information need to achieve a manageable focus
7     17            64  2.2a   Develops a research plan appropriate to the investigative method
8     21            80  3.4g   Selects information that provides evidence for the topic
9     22            81  1.2d   Identifies the purpose and audience of potential resources (e.g., popular vs. scholarly, current vs. historical)
10    27            80  1.4a   Reviews the initial information need to clarify, revise, or refine the question

Outcomes from all five standards are represented in the ten highest rated outcomes. When looking at the thirty highest rated outcomes, only two from Standard 2 are there—2.5d (records pertinent citation information for future reference) and 2.2a (develops a research plan appropriate to the investigative method). The other four standards are more widely represented. This is interesting, given that Standard 2 is where many ‘‘traditional’’ library skills reside (e.g., identifying keywords and synonyms, selecting controlled vocabulary, constructing searches using Boolean, using classification schemes). The outcomes representing these skills are ranked, respectively, fortieth, fifty-sixth, fifty-fourth, and sixty-second.

Outcomes of Low Importance

Table 6 summarizes the ten ACRL IL outcomes that faculty rated the lowest in terms of importance. As with the highest rated outcomes, all five standards are represented in these lowest rated. Differences across schools were minor. The lowest rated outcomes overall were also rated low by Arts, Science, and Social Science faculty. Again, there were not enough responses from faculty in professional schools to warrant a breakdown.

‘‘It is encouraging that so few outcomes have been deemed ‘‘not important’’ by even moderate numbers of faculty in this survey.’’

The majority of outcomes (67.8 percent) had a range of ratings, from not important to very important by faculty members. However, only thirteen were rated as not important at the tenth percentile of respondents, five at the twentieth percentile and only one at the thirtieth percentile. It is encouraging that so few outcomes have been deemed ‘‘not important’’ by even moderate numbers of faculty in this survey.

Non-response

For more than 80 percent of the outcomes, the question about importance was either left blank or answered as ‘‘don’t know’’ by at least one respondent. The most common outcomes with such a response were 2.2c (‘‘selects controlled vocabulary specific to the discipline or information retrieval source’’—15.2 percent unanswered or ‘‘don’t know’’), 4.3c (‘‘incorporates principles of design and communication’’—13.6 percent unanswered or ‘‘don’t know’’), and 5.2b (‘‘uses approved passwords and other forms of ID for access to information resources’’—10.9 percent unanswered or ‘‘don’t know’’). All of these fall in the middle to low end of average importance, even when non-response is taken into account.

A few surveys were returned with marginal notes regarding the language used in the outcomes. There were comments about vagueness, usage of certain words, and complexity of the language. One survey was returned with only two of five pages completed and a note on the front page saying in part ‘‘too damned complex.’’ It is reasonable to assume that other surveys were simply thrown out unanswered for the same reason, thus accounting, in part, for the low response rate.

Academic Level at which Faculty Expect Students to Show Outcomes

Table 7 shows the top ten outcomes with the academic levels at which faculty expect students to show them. The response rate was too low to generate any meaningful data by schools. For six of the ten outcomes with the highest average importance, more than half of the surveyed faculty expect students to display these outcomes in their first year of university or in high school. Less than half of the respondents agree on an academic level for the remainder. In only one case (3.2c—recognizing prejudice, deception, or manipulation) do more faculty expect to see the outcome in second year than in first year or earlier.


Table 9. Academic Level Expected for Top Ten Librarian-Responsible Outcomes

Outcome  High School (%)  1st Year (%)  2nd Year (%)  3rd Year (%)  4th Year (%)  Later (%)  Never (%)
1.1c     26.5             45.8          18.1          7.2           1.2           1.2        –
1.1e     7.5              47.5          25.0          16.3          2.5           1.3        –
3.2c     4.9              29.3          35.4          18.3          11.0          1.2        –
5.3a     10.6             47.0          27.3          13.6          –             –          1.5
3.2a     2.4              24.4          36.6          26.8          6.1           2.4        1.2
1.1d     6.3              39.2          26.6          21.5          5.1           1.3        –
2.2a     1.6              15.6          28.1          37.5          15.6          1.6        –
3.4g     12.5             40.0          30.0          10.0          5.0           1.3        1.3
1.2d     4.9              51.9          17.3          14.8          8.6           2.5        –
1.4a     2.5              35.8          31.3          21.0          7.4           1.2        –

1.1c: Explores general information sources to increase familiarity with the topic.
1.1e: Identifies key concepts and terms that describe the information need.
3.2c: Recognizes prejudice, deception, or manipulation.
5.3a: Selects an appropriate documentation style and uses it consistently to cite sources.
3.2a: Examines and compares information from various sources in order to evaluate reliability, validity, accuracy, authority, timeliness, and point of view or bias.
1.1d: Defines or modifies the information need to achieve a manageable focus.
2.2a: Develops a research plan appropriate to the investigative method.
3.4g: Selects information that provides evidence for the topic.
1.2d: Identifies the purpose and audience of potential resources (e.g., popular vs. scholarly, current vs. historical).
1.4a: Reviews the initial information need to clarify, revise, or refine the question.


‘‘For six of the ten outcomes with the highest average importance, more than half of the surveyed faculty expect students to display these outcomes in their first year of university or in high school.’’

If faculty expect students to display these outcomes in first year, IL instruction must happen early in that first year. Certainly this follows traditional patterns of instruction aimed at first year students but does not mesh well with the approach of integrating IL instruction throughout the three or four years of a departmental curriculum.

Instructional Responsibility

In the ACRL document ‘‘Objectives for Information Literacy Instruction: A Model Statement for Academic Librarians,’’ the IL outcomes are ‘‘tagged’’ as being ‘‘primarily librarians’ responsibility’’ or with ‘‘responsibility shared by librarians and the course instructor.’’ Only those outcomes ‘‘that could best be addressed by the librarian or by the librarian and course instructor collaboratively’’26 are included in the document. Just nine of the eighty-seven outcomes are tagged as being primarily librarians’ responsibility, and an additional twenty-five are of shared responsibility. The remaining fifty-three outcomes were therefore considered best addressed by the course instructor alone. Certainly some of this classification can be debated—for example, outcome 5.2f regarding plagiarism is not tagged with any librarian responsibility even though many librarians provide instruction on plagiarism. The ACRL document notes that ‘‘local preferences may vary’’; however, it is interesting to look at the current data in light of these categorizations.

Table 8 shows the ten outcomes with the highest average importance when the data are limited to only those thirty-four outcomes that have either full or partial librarian responsibility. The table also shows how each outcome ranks in all eighty-seven outcomes. There are four librarian-responsible outcomes in the top ten, and only one librarian-responsible outcome in the bottom ten. Two of the thirty-four librarian-responsible outcomes (or 5.9 percent) have average importance ratings of less than three, whereas fourteen of the fifty-three course-instructor-responsible outcomes (26.4 percent) have average importance ratings of less than three. Although these distinctions may be arbitrary, it is interesting to note that only two librarian-responsible outcomes have an average rating of less than ‘‘Important,’’ whereas more than a quarter of the course-instructor-responsible outcomes fall into that category.

Table 9 shows the top ten outcomes that have full or partial librarian responsibility with the academic levels at which faculty expect students to show them. For five of the ten outcomes, more than half of the faculty expect their students to display them in their first year of university or in high school. In two cases (3.2c—recognizing prejudice, deception, or manipulation, and 3.2a—examining information from various sources in order to evaluate), more faculty expect the outcomes in second year than in first year or earlier. In one case (2.2a—developing a research plan), more faculty expect the outcome in third year than at any other time. It seems these librarian-responsible outcomes are more likely to be integrated across the full range of years in a curriculum.


CONCLUSION

In the introduction to the Standards, librarians are cautioned that these are not the be-all and end-all of information literacy; that the Standards may need to be tailored to specific institutions and specific disciplines. The introduction in 2005 of the Information Literacy Competency Standards for Science27 indicates that this is taking place, but to many in the higher education library instruction community, the Standards are used as is. Modifying them to suit one’s own institution, let alone the disciplines within it, would take a luxury of time that most librarians do not have. Even without looking at modification, when faced with the large number of outcomes it is often difficult to simply prioritize what should be taught and when. This study was an attempt to look at what faculty think about this, not in a personalized way – What do faculty at my institution think? – but in a more standardized way – What do faculty generally think? The results may give librarians a starting point when approaching faculty, or when setting priorities in general IL programs. At the very least, it points out a few of the eighty-seven outcomes that may not need to be covered and emphasizes the ones that do.

As has been stated, the results of this study may not be widely generalizable, but there are a few points that may be taken away.

There was not a lot of agreement on the academic level at which IL outcomes are expected by faculty. Where there was agreement, it tended to be for outcomes expected in the first year of university or earlier. This certainly backs up the focus of many in the instruction community on targeting first year students, whether through First Year Experience Programs, first year composition courses, first year English courses, or other large first year classes. It may be slightly disheartening for those who would like to move beyond this and into progressively integrating more advanced skills into four-year programs, but the data do not indicate a need to abandon upper-level instruction for first-year programs. First of all, the lack of general agreement on academic levels is a good indication that support does exist for teaching students at higher levels. Secondly, the IL outcomes are quite vague; a single outcome could be broken down into many specific objectives that range in difficulty and therefore could be taught over a few years. Faculty responding to the survey may have assumed a basic skill level for each outcome. More in-depth research would be required to look into how faculty interpret the Standards’ outcomes.

‘‘There was not a lot of agreement on the academic level at which IL outcomes are expected by faculty. Where there was agreement, it tended to be for outcomes expected in the first year of university or earlier.’’

Faculty responding to the survey had difficulty with the language of the IL outcomes. Several surveys were returned with marginal notes asking for clarification, complaining of vagueness, or just decrying the language used (one person circled every instance of the phrase ‘‘information literate’’ and wrote next to one, ‘‘a horrible term!’’). Informally at my own campus, I was told that there seemed to be a lot of repetition, and that the wording was quite confusing. It would be useful to conduct focus groups with faculty to see how they are interpreting the outcomes; it could well be that their importance scores would change if it was explained what the outcomes really mean.

Further research on faculty perceptions of the Information Literacy Competency Standards for Higher Education would be useful in developing a more valid and reliable map of the IL outcomes faculty identify as most (and least) important. More data would also allow a breakdown by department—Do all Arts faculty feel strongly about primary vs. secondary documents, or just History faculty? It would also be useful to get a sense of what IL skills faculty see as important, without making reference to the Standards. By basing our IL programs and assessment on the Standards, perhaps we are missing out on some vital skills.

As was said at the outset, the results of this research are meant to be a starting point for discussions with faculty. Perhaps they could also be a starting point for discussions with other instruction librarians in deciding which of the outcomes we think are important and at what academic level we expect our students to display them. Six years after their publication, perhaps it is time to revisit the Standards, and – together with our faculty – draft an updated version that takes into account the experiences that both librarians and faculty have had with implementing and integrating them thus far.

NOTES AND REFERENCES

1. For examples, see Vogel Library at Wartburg College, Curriculum Map of the Information Literacy Competency Standards for Higher Education. Online. (2003) Available: http://public.wartburg.edu/library/infolit/currmap.html; University of Connecticut Libraries, Information Literacy: Program and Desired Outcomes. Online. (2005) Available: http://webapps.lib.uconn.edu/Outcomes; and University of Louisville Libraries, Integration of Information Literacy at UofL: Working Document of the University Libraries Information Literacy Team. Online. (2001) Available: http://www.louisville.edu/infoliteracy/infolitoutcomes.htm.

2. Project SAILS, Project SAILS Skill Sets. Online. (2005) Available: http://sails.lms.kent.edu/plans/skillsets.html.

3. Educational Testing Service, Succeeding in the 21st Century: What Higher Education Must Do to Address the Gap in Information and Communication Technology Proficiencies. Online. (2003) Available: http://www.ets.org/ictliteracy/ICTwhitepaperfinal.pdf.

4. Association of College and Research Libraries e-Learning Seminars, Assessing Student Learning Outcomes. Online. (2005) Available: http://www.ala.org/ala/acrl/acrlproftools/assessingstudent.htm; and Association of College and Research Libraries e-Learning Seminars, Information Literacy and Assessment. Online. (2005) Available: http://www.ala.org/ala/acrl/acrlproftools/informationliteracy.htm.

5. For example, Marjorie M. Warmkessel and Joseph M. McCade, ‘‘Integrating Information Literacy Into the Curriculum,’’ Research Strategies 15 (Spring 1997): 80–88; Patrick Ragains, ‘‘Infusing Information Literacy Into the Core Curriculum: A Pilot Project at the University of Nevada, Reno,’’ portal: Libraries and the Academy 1 (Oct. 2001): 391–407; Joan Parks and Dana Hendrix, ‘‘Integrating Library Instruction Into the Curriculum Through Freshman Symposium,’’ Reference Services Review 21 (Spring 1996): 65–71.

6. For example, Cecelia Brown and Lee R. Krumholz, ‘‘Integrating Information Literacy Into the Science Curriculum,’’ College & Research Libraries 63 (Mar. 2002): 111–23; Lyn Thaxton, Mary Beth Faccioli, and Anne Page Mosby, ‘‘Leveraging Collaboration for Information Literacy in Psychology,’’ Reference Services Review 32, no. 2 (2004): 185–189; Margaret C. Wallace, Allison Shorten, Patrick A. Crookes, Catriona McGurk and Chris Brewer, ‘‘Integrating Information Literacies Into an Undergraduate Nursing Programme,’’ Nurse Education Today 19 (Feb. 1999): 136–41.

7. For example, Beth Christensen, ‘‘Warp, Weft, and Waffle: Weaving Information Literacy Into an Undergraduate Music Curriculum,’’ Notes 60 (Mar. 2004): 616–31; Barbara J. D’Angelo and Barry M. Maid, ‘‘Moving Beyond Definitions: Implementing Information Literacy Across the Curriculum,’’ Journal of Academic Librarianship 30 (May 2004): 212–17; Ann M. Feigen, Bennett Cherry, and Kathleen Watson, ‘‘Reflections on Collaboration: Learning Outcomes and Information Literacy Assessment in the Business Curriculum,’’ Reference Services Review 30, no. 4 (2002): 307–318.

8. Association of College and Research Libraries, Characteristics of Programs of Information Literacy that Illustrate Best Practices: A Guideline. Online. (2003) Available: http://www.ala.org/ala/acrl/acrlstandards/characteristics.htm.

9. For example, Madeline Ford and Clay Williams, ‘‘Research and Writing in Sociology: A Collaboration Between Classroom Instructor and Librarian,’’ Public Services Quarterly 1, no. 3 (2002): 37–49; Feigen, Cherry, and Watson, ‘‘Reflections on Collaboration.’’

10. Anita Cannon, ‘‘Faculty Survey on Library Research Instruction,’’ RQ 33 (Summer 1994): 524–541.

11. Gloria J. Leckie & Anne Fullerton, ‘‘Information Literacy in Science and Engineering Undergraduate Education: Faculty Attitudes and Pedagogical Practices,’’ College & Research Libraries 60 (Jan. 1999): 9–29.

12. Annmarie B. Singh, ‘‘A Report on Faculty Perceptions of Students’ Information Literacy Competencies in Journalism and Mass Communication Programs: The ACEJMC Survey,’’ College & Research Libraries 66 (July 2005): 294–310.

13. Jacqui Weetman, ‘‘Osmosis—Does It Work for the Development of Information Literacy?’’ Journal of Academic Librarianship 31 (Sept. 2005): 456–460.

14. The Society of College, National & University Libraries Advisory Committee on Information Literacy, Information Skills in Higher Education: A SCONUL Position Paper. Online. (1999) Available: http://www.sconul.ac.uk/activities/inf_lit/papers/Seven_pillars2.pdf.

15. Brown and Krumholz, ‘‘Integrating Information Literacy.’’
16. Ibid., 113.
17. Ford and Williams, ‘‘Research and Writing in Sociology.’’
18. Molly R. Flaspohler, ‘‘Information Literacy Program Assessment: One Small College Takes the Plunge,’’ Reference Services Review 31, no. 2 (2003): 129–140.

19. Ibid., 132.
20. Lori E. Buchanan, L. Luck DeAnne and Ted C. Jones, ‘‘Integrating Information Literacy Into the Virtual University: A Course Model,’’ Library Trends 51 (Fall 2002): 144–166.

21. Feigen, Cherry, and Watson, ‘‘Reflections on Collaboration.’’
22. Canadian Association of University Teachers, CAUT Almanac of Post-Secondary Education in Canada 2006. Online. (2006) Available: http://www.caut.ca/en/publications/almanac/default.asp, Section 8, Table 8.3.

23. Ibid., Section 2, Table 2.12.
24. Ibid., Section 2, Table 2.10.
25. Ibid., Section 2, Table 2.11.
26. Association of College and Research Libraries Instruction Section, Objectives for Information Literacy Instruction: A Model Statement for Academic Librarians. Online. (2001) Available: http://www.ala.org/ala/acrl/acrlstandards/objectivesinformation.htm.

27. Association of College and Research Libraries, Information Literacy Standards for Science and Technology (DRAFT). Online. (2005) Available: http://www.ala.org/ala/acrl/acrlstandards/infolitscitech.htm.

