

The Quality Indicators for an Information Retrieval System: User's Perspective

Roslina Othman
Faculty of ICT, International Islamic University Malaysia

Abstract

An IR system must be designed to satisfy a user's information need. To achieve quality results, the system must help users to construct quality searches. Thus the aim of this project was to identify a set of quality criteria and indicators to evaluate an IR system. Survey, observation, and self-reporting logs were used to compile a list of quality criteria and their indicators, and expected features, from 250 users. The findings revealed quality criteria that include content, retrieval features (e.g. term density, term boosting, and fuzzy searching), user interface, thesaurus-enhanced search (e.g. visualization of search results), help and feedback mechanisms, and administrative considerations.

1. Introduction

An information retrieval (IR) system offers features and interfaces to help users find relevant information. However, users face difficulties in finding the right information in an IR system [1]. These difficulties relate to the application of retrieval features and to search formulation. IR systems share many common features, such as Boolean operators, word/phrase search, proximity, stemming, truncation, and wildcards; however, their interpretation and implementation vary from one system to another [1]. Users struggled to construct search terms and to include term variations [2], even though they had undergone training in searching.

1.1. Measures for quality searches

In estimating the quality of searches conducted by users, measures such as recall, precision, similarity, user effort, and usability have been applied to see whether an IR system is able to satisfy a user's information need.

1.1.1. Recall, precision, and similarity

Recall indicates a system's ability to retrieve all relevant items in the collection. Precision measures a system's ability to reject irrelevant items. Similarity [3] shows a system's ability to retrieve items that closely match known items; in this case, the known items are representations of the user's information need.
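In set terms, recall and precision can be computed directly from the retrieved set and the set of user-judged relevant items. A minimal sketch follows; the document IDs and relevance judgments are invented for illustration.

```python
# Illustrative computation of recall and precision over a toy collection.
# Document IDs and relevance judgments are hypothetical.
retrieved = {"d1", "d2", "d3", "d5"}        # items returned by the system
relevant = {"d1", "d3", "d4", "d6", "d7"}   # items the user judges relevant

hits = retrieved & relevant                 # relevant items actually retrieved

recall = len(hits) / len(relevant)          # share of relevant items found
precision = len(hits) / len(retrieved)      # share of retrieved items relevant

print(f"recall={recall:.2f}, precision={precision:.2f}")
# hits = {d1, d3}: recall = 2/5 = 0.40, precision = 2/4 = 0.50
```

High recall with low precision, or the reverse, is exactly the situation that prompts the evaluator to examine search strategies and retrieval features.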

Recall, precision, and similarity yield levels of retrieval performance based on search results, which then prompt the evaluator to examine search strategies, retrieval features, and user interfaces to identify the causes of that level of performance. Recall, precision, and similarity reveal successes or failures related to search strategies and indexing languages.

1.1.2. Data Retrieval and Information Retrieval

Search results are ranked according to system relevance, while these measures are computed based on the user's relevance. The levels of performance produced by recall, precision, and similarity indicate the extent to which system relevance matches or mismatches the user's relevance. Often the system's relevance does not correspond with the user's relevance [4].

The retrieval features and user interfaces are designed to help users construct a query that matches the index terms assigned to documents in the collection. A document is retrieved by the system when an index term matches the query. However, a user may not agree that the document is relevant to his/her information need.

In matching the query against the index terms, the system is actually performing data retrieval: it matches index terms with the query, and not necessarily documents with information needs. Matching documents with information needs is information retrieval.
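The distinction can be made concrete with a minimal sketch of literal term matching against assigned index terms; the documents and index terms below are invented for illustration, echoing the online/electronic banking example discussed later in the paper.

```python
# Data retrieval: match query terms against assigned index terms literally.
# Documents and index terms are invented for illustration.
index = {
    "d1": {"online banking", "security"},
    "d2": {"electronic banking", "payments"},
}

def data_retrieve(query_term):
    """Return documents whose index terms literally contain the query term."""
    return {doc for doc, terms in index.items() if query_term in terms}

# The literal match succeeds only for the exact index term:
print(data_retrieve("online banking"))      # {'d1'}
# d2 covers the same concept (electronic banking) but is not retrieved,
# because matching terms is not the same as matching information needs.
print(data_retrieve("electronic banking"))  # {'d2'}
```

A user whose information need covers banking over the Internet would consider both documents relevant, yet data retrieval returns only the one whose index term matches the query string.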

Users expect an IR system to provide information or data with the semantics they have in mind. To achieve an effective match, users must be able to translate information needs into search terms, apply the appropriate retrieval features and user interfaces, and construct effective search strategies. This leads to the inclusion of measures concerned with the user's searching ability.

1.1.3. User effort and usability

Measures that focus on the user's searching ability include user effort and usability. User effort [5] measures the time a user spends searching in an IR system to fulfill his/her information need. User effort includes time spent learning to use the system, since the user learns while searching in it.

Usability [6] is concerned with the use of interfaces and retrieval features to achieve desired search results; it focuses on both search formulation and search results. Users need to perform effective searches in a quality IR system. Quality is defined here as finding the right information, that is, information that satisfies the user's information need.

0-7803-9521-2/06/$20.00 © 2006 IEEE. 1738

1.2. Aim of this project

Thus the aim of this project was to identify a set of criteria that could be applied to evaluate the quality of an information retrieval (IR) system. The objectives were: to compile quality criteria and indicators identified from a review of articles, and to gather a set of quality criteria and their indicators from users' perspectives. The expected outcome of this project is a checklist of quality criteria and indicators for IR system selection.

Many authors, such as Shneiderman [7], Jacso [8], Xie [9], and Shiri [10], have evaluated IR systems focusing on users' problems in searching for information, and have suggested modifications and enhancements to the systems. For example, a thesaurus-enhanced search feature [10] has been added to an IR system and has been found useful in formulating search strategies. Thus the availability of such a feature could potentially be listed as a quality criterion from the user's perspective.

2. Review of Literature

Quality criteria and indicators have been developed as evaluation criteria for online databases, user interfaces (including user control and ease-of-use), search formulation aids, search enhancement features, and help mechanisms.

2.1. Evaluation criteria for online databases

Curtis [11] identified the following evaluation criteria and indicators: content (appropriateness, scope, and accuracy), technical aspects (accessibility, design and presentation, and navigation), search features (search input and search output), and administrative considerations. Appropriateness relates to the audience, targeted curriculum standards and learning outcomes, and the readability scale of the work. Scope includes subject and year coverage, hypertext links, study guides, and duplication of resources. Accuracy covers authoritative information, freedom from bias and stereotyping, frequency of updating, and freedom from grammatical and mechanical errors. Accessibility includes local and remote access, availability of content in other languages, and support for different browsers. Design and presentation cover user friendliness, simple organization, uncluttered screen displays, and avoidance of flashing images and text noise. Navigation includes aids such as icons and pull-down menus that help users move around the resources, and operational hyperlinks. Search input relates to search methods such as Boolean operators, limits, advanced search options, and a thesaurus. Search output includes ranking of search results, display and sorting, and printing and emailing of items. Administrative considerations include published reviews, privacy policy statements, and copyright policies.

Oulanov and Pajarillo [12] applied the Software Usability Measurement Inventory (SUMI) tool, whose validity and reliability in assessing new products during product evaluation have been established, to measure five criteria: affect (the user's feeling about using a system), efficiency (the degree to which a system is able to achieve its goal and intended task), learnability (the degree to which a user is able to learn to use the system), helpfulness (the user's feeling that a system helps him/her resolve problems and difficulties), and control (the user's feeling that a system can be easily internalized). The SUMI tool consists of a 50-item questionnaire.

2.2. User interfaces

Shneiderman [7] proposed criteria that focus on user interfaces: (a) consistency in terminology, layout, and instructions; (b) shortcuts for experienced users; (c) informative feedback about the search; (d) the ability to undo or modify an action; (e) user control in specifying parameters; (f) clear error messages and easy error correction; and (g) alternative interfaces for expert and novice users.

2.2.1. User control and ease-of-use

An exhaustive study of user control was conducted by Xie [9]. To her, user control is another quality criterion that incorporates many options, such as more options for query formulation, construction of a precise query, and assistance from a help mechanism. Users in her study defined user control as the ability of an IR system to support constructing a specific and exact query, maintaining control over the search process, navigating through the system with understanding, and providing on-screen explanations via the help mechanism. She also characterized user control as a requirement of experienced users, while beginners would ask for ease-of-use. The latter relates to "how easy a system is for even an inexperienced user to use easily", and the benchmark has to be users in the targeted group.

Users in her study would like to have user control and ease-of-use through query formulation features such as multiple search modes, running saved searches, browsing, field search, tools for searching such as an index list, search history, edit/revise search, limit search, hyperlinks in results, and finding related information in other databases.

2.2.2. Help mechanism and result organization

Xie [9] found that the indicators for a help mechanism are on-screen explanations, general help, source information, tips, examples, and tutorials. Users would like to have useful examples, especially when having problems with query formulation. On-screen explanations should provide brief information about features on the screen and be context sensitive. Source information consists of information about the database in terms of subject and year coverage, and thus helps users to decide on the relevant database or IR system for searching. Brajnik et al. [13] proposed a set of evaluation criteria: technical help (terminological and strategic help) and user-controlled interaction.

Another criterion identified by Xie [9] is result organization and evaluation; its indicators are sort by date, sort by relevance, and citation, full, and custom views. In terms of results delivery, users would like to see email, download, print, and clean-copy features. Of the 234 users surveyed in the study conducted by Dumais et al. [14], 50% preferred to display their search results by relevance ranking and the other 50% preferred ranking by date of publication. Users preferred relevance ranking for results in research and search engines, while ranking by date of publication was preferred for personal content.

2.3. Search-enhancement aids

Many authors have proposed the inclusion of search enhancement features in an IR system to enable users to formulate quality searches. Thus the availability of these features, where perceived useful, would be expected by users as a quality criterion of an IR system. Such search enhancement aids include a thesaurus-enhanced search interface [15], search formulation aids [6], side and top views of search results for further manipulation [14], post-query refinement and word proximity and position [16], and links to cited references [17]. Cited references are hot-linked, which allows users to access the abstract and even the full text of the cited work.

Shiri [10] found that Boolean operators are useful when incorporated within a thesaurus-enhanced search interface. However, he also found that users needed more help in composing the terms selected from a thesaurus into Boolean search statements. In addition, he found that users who made fewer physical search moves retrieved a higher number of relevant items, reinforcing that the quality of search moves is more important than their quantity. Jacso [18] found that each term in the thesaurus must have a scope note, which must include the scope, history and usage notes, and a list of entry terms.

Chowdhury and Chowdhury [6] proposed a set of criteria that could assist users in formulating a quality query: search options, mode of query formulation, options for formulating complex queries, search fields, multiple database/resource search, and query modification and saving.

An IR system must have explanations with regard to implicit proximity and positional operators [16]. For example, some IR systems automatically interpret a space between two or more words as a Boolean AND. In addition, an IR system would match the exact descriptor and not part of the descriptor unless the user ended the last word with a question mark. Users had to find phrases or terms to alternate with stop words, such as nursing at home with home nursing.
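The implicit-AND behavior can be contrasted with phrase interpretation in a short sketch; the two documents below are invented for illustration.

```python
# Sketch of implicit-AND versus phrase interpretation of a two-word query.
# Documents are invented for illustration.
docs = {
    "d1": "home nursing services for elderly patients",
    "d2": "admission criteria for a nursing home",
}

def boolean_and(query, text):
    """Treat the space as Boolean AND: every query word must occur somewhere."""
    return all(word in text.split() for word in query.split())

def phrase(query, text):
    """Treat the query as an exact phrase."""
    return query in text

q = "home nursing"
print([d for d, t in docs.items() if boolean_and(q, t)])  # ['d1', 'd2']
print([d for d, t in docs.items() if phrase(q, t)])       # ['d1']
# Implicit AND conflates 'home nursing' with 'nursing home';
# phrase search keeps the distinction the user intends.
```

This is why a system that silently turns the space into AND needs an explicit explanation of that behavior, as the paragraph above argues.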

3. Methodology

Survey, observation, and self-reporting logs were used to compile the list of quality criteria from a sample of 250 users. These users had experience in searching and knew the advanced search features.

This project was limited to the 25 IR systems subscribed to by the International Islamic University Malaysia during the year 2004-2005. Most users who have been using an IR system already have a set of reasonable quality criteria for an IR system. Thus they have always expected that the retrieval features and user interfaces they found useful in constructing a quality search in one IR system ought to be present in other systems as well [1].

This project contributes not only the quality criteria themselves, but also the fact that these criteria came from people who have been trained to use all 25 databases (from 13 different providers) and who have experience in searching for at least three information needs in IEEE Xplore, ACM Digital Library, Business Source Complete, Scopus, Medline, Science Direct, PsycInfo, SocINDEX, and Emerald Fulltext.

3.1. The questionnaire for the quality criteria and indicators of an IR system

A 60-item questionnaire was derived from the quality criteria and expected features given in the literature, direct observation (including unstructured interviews) of users during training and searching in an IR system, and self-reporting logs kept by the users in fulfilling their information needs. Users were given the questionnaire and asked to complete it in the presence of the researcher.

The questions covered measures useful for evaluating the extent to which an IR system satisfies their information needs, quality criteria and their indicators, and expected features that could improve the system. Questions on the quality criteria were asked in both structured and unstructured formats. The quality criteria range from content, retrieval features, user interface, and help mechanism to system feedback for constructing quality searches and manipulating search results.

4. Findings

Users agreed that recall, precision, and similarity are useful measures for evaluating the match between the search results and their information needs. However, there is a need to go beyond these measures. They were of the opinion that, in order to achieve quality search results, retrieval features, user interfaces, the help mechanism, and system feedback must help users in their search formulation and allow manipulation of search results.

In addition, user control was perceived by all 250 users as the most important quality criterion. They defined user control as the ability to manipulate the retrieval features, the built-in term-selection and term-enhancement features, results ranking and printing, and the help mechanism. They expected to have user control even at the basic or quick search level. Users mentioned that they hold this perception because they have undergone training and, to some extent, gained searching experience.

The second most important quality criteria relate to information visualization, and the third concern the mapping between search terms and the index terms given in the thesaurus. Even though none of the 25 IR systems has an information visualization feature, it was suggested by the 250 users as the solution to their searching problems. The Medline database provided by Ovid has offered term mapping.
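Term mapping of the kind Ovid Medline offers can be sketched as a lookup from a user's entry term to a preferred thesaurus descriptor. The mapping table below is invented for illustration and is not Ovid's actual vocabulary.

```python
# Sketch of term mapping: a user's search term is mapped to the
# preferred thesaurus descriptor before the search is run.
# The mapping table is invented for illustration.
entry_to_descriptor = {
    "heart attack": "Myocardial Infarction",
    "cancer": "Neoplasms",
    "online banking": "Electronic Banking",
}

def map_term(user_term):
    """Return the preferred descriptor, falling back to the term itself."""
    return entry_to_descriptor.get(user_term.lower(), user_term)

print(map_term("Heart attack"))  # Myocardial Infarction
print(map_term("thesaurus"))     # thesaurus (no mapping available)
```

The improvement users asked for (see Section 4.1.1) amounts to showing the user which descriptor a term will be mapped to, and letting the user omit unwanted mappings.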

The fourth criteria relate to the availability of a feedback mechanism to help users construct a query from a sentence or fragment that contains a stop word. The example given by Jacso [16] was home nursing (which means nursing at home) versus nursing home. A simple search using the Boolean operator, nursing AND home, retrieved items on nursing home as well as home nursing. Of the 250 users, 176 had problems coming up with the term home nursing for nursing at home during the training.

Users mentioned that all features of the systems must work as claimed, and that this is the thrust of all the quality criteria.

4.1. Search Formulation

Users were satisfied with the retrieval features, except that they had to remember the interpretation and implementation of each feature in these 25 systems. They agreed that Boolean operators, word/phrase search, proximity, stemming, truncation, and wildcards are useful in query construction. A simple and brief explanation, like the one for phrase search in the ACM Digital Library (Fig. 1), should be included in all IR systems. Another way would be the provision of context-sensitive help at each retrieval feature.

Figure 1. Advanced search option in ACM Digital Library

They would also like to include the term density offered in IEEE Xplore as the <many> operator, and the breakdown of the 'with' operator into <paragraph> and <sentence> (Fig. 2).

Figure 2. List of search operators in IEEE Xplore

In addition, the fuzzy search and term boosting (e.g. retrieval^4 system means that the term retrieval receives four times the weighting given to the term system) offered by Emerald Full-text are also perceived as important quality criteria. However, 109 users stated that the fuzzy search is not really a requirement for a quality IR system unless it is improved to cover a term, its spelling variations, synonyms, and antonyms. Retrieving terms that appear between two terms in an alphabetical arrangement should be named 'range searching' rather than fuzzy search.
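The effect of term boosting on ranking can be sketched with a simple weighted term-count score. This is an illustrative scoring scheme with invented documents, not Emerald's actual algorithm.

```python
# Sketch of term boosting in ranking: a boosted term contributes its
# boost factor times its occurrence count to the document score.
# Documents and the scoring scheme are simplified illustrations.
docs = {
    "d1": "retrieval retrieval models",
    "d2": "system system system design",
}

def score(doc_text, weights):
    """Score a document as the weighted sum of its term occurrences."""
    words = doc_text.split()
    return sum(weights.get(w, 0) * words.count(w) for w in set(words))

# Query 'retrieval^4 system': 'retrieval' is weighted 4x relative to 'system'.
weights = {"retrieval": 4, "system": 1}
ranked = sorted(docs, key=lambda d: score(docs[d], weights), reverse=True)
print(ranked)  # d1 scores 4*2 = 8, d2 scores 1*3 = 3 -> ['d1', 'd2']
```

Without the boost (equal weights), d2's three occurrences of 'system' would outrank d1; the boost lets the user state which query term matters more.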

Users required a clear guide on how to apply the retrieval features to find the right information. They complained that search examples and tips are often missing from the systems, and even where given, these examples are unclear and confusing. Thus clarity and detailed examples are needed.

4.1.1. Thesaurus

Users perceived the thesaurus more as an indexing tool and a reference for the terms assigned by indexers to represent documents than as a retrieval tool for users. Thus, to them, the arrangement is hierarchical to reflect broader, narrower, and related terms. Users selected broader terms with the intention of including all the narrower and related terms; however, a broader term actually means that the document discusses a broad topic or subject. Users should have selected the explode feature if their intention was to include the narrower and related terms. Currently, no information is available in the help menu or guide of any of these IR systems explaining the appropriate use of the thesaurus.
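The explode behavior described above can be sketched as a recursive expansion over narrower terms; the thesaurus fragment below is invented for illustration.

```python
# Sketch of the thesaurus 'explode' feature: searching a broader term
# with explode expands the query to all of its narrower terms, recursively.
# The thesaurus fragment is invented for illustration.
narrower = {
    "information systems": ["information retrieval", "databases"],
    "information retrieval": ["query formulation", "relevance feedback"],
}

def explode(term):
    """Return the term together with all of its narrower terms."""
    terms = [term]
    for nt in narrower.get(term, []):
        terms.extend(explode(nt))
    return terms

# Without explode, a search on the broader term matches only documents
# indexed under that exact term; with explode, narrower terms join the query.
print(explode("information systems"))
```

This illustrates the misunderstanding reported above: selecting the broader term alone does not include narrower and related terms; the explode step does.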


The thesaurus should have information visualization. Terms in the thesaurus must be shown visually as clusters, with relationships within and between clusters. For example, the term image is placed in a multimedia cluster, has a relationship with the term texture in the same multimedia cluster, and has a relationship with the term similarity in the retrieval performance cluster. Users found it easier to visualize terms in clusters instead of a hierarchical structure. In addition, the system must display the number of hits for each term-to-term relationship. The mapping provided by Ovid Medline should be maintained with a slight improvement: users must be informed of the terms that will be mapped, and users must be able to omit terms.

Users would like to retain the ability to limit the search to the descriptor and author-keyword fields (Fig. 3). The descriptors and author-keywords must represent the whole document; upon looking at all the terms in the descriptor and author-keyword fields, users are able to judge the content of the document. Scope notes, together with term usage, synonyms, and antonyms, must also be included.

Figure 3. ACM Digital Library

The <thesaurus> operator in IEEE Xplore promises the retrieval of a term and its synonyms (Fig. 3). However, the system failed to retrieve documents on electronic banking for the query <thesaurus> online banking when the document (Fig. 4) did not contain the term online banking; the system retrieved documents on online banking only.

4.2. Search results

Users would like to have control over ranking by relevance and publication date, saving search results, printing, and emailing items. The system must provide links to similar items within the database and on the Internet. The citing and cited articles offered by Business Source Premier (EBSCOhost) were found to be useful. Users suggested the ability to print items to a .pdf file to enable printing at a convenient time and to avoid peak hours at the printing site.

In addition, users would like to be able to refine searches through lateral searching. Visualization of the search results is seen as a critical need by 183 users. Such visualization gives a clue as to how the search results are related and correspond to their query. A feedback mechanism should be incorporated to suggest a better way of searching if the query formulated by the user did not yield much success. The feedback could refer users to a search guide or online tutorial.

4.3. Quality criteria and indicators

The quality criteria and indicators derived from this project are given in Table 1.

Table 1. User's quality criteria and indicators

1. Content
   a. Subject and year coverage
   b. Surrogates and full-text in .pdf
   c. Research papers, conference papers, and case studies as the main document types
   d. Cited and citing articles
   e. Surrogates consist of index terms and author-keywords that explain the content of the whole document

2. Retrieval features
   a. Boolean operators
   b. Word and phrase
   c. Proximity
   d. Term density
   e. Stemming
   f. Truncation and wild card
   g. Lateral searching
   h. Term selection from a thesaurus
   i. Limit field
   j. Find similar items
   k. Term variations
   l. Explode and expand search
   m. Rule of precedence
   n. Range searching
   o. Term boosting
   p. Fuzzy searching to include synonyms
   q. Link features
   r. Options for ranking results
   s. Saving, emailing and printing capability

3. User interface
   a. Pull-down menu
   b. Query builder
   c. Search within results
   d. Refine search

The quality criteria and indicators show that users are very concerned about, and understand, the need to conduct quality searches to retrieve quality search results.

5. Conclusion

Features currently offered by IR systems satisfy retrieval at the data level and not at the information level. Users found difficulties in finding items containing relevant information even though they knew how to use the IR systems. The quality criteria and indicators suggest that an IR system should be evaluated on whether it is able to assist users in searching for relevant information. Thus quality for an IR system is now a function of the system's ability to assist users in fulfilling their information needs.

These qualities are, no doubt, the abilities users expected of the system, and users evaluated the system based on their expectations. Users suggested some features as solutions to their searching problems. Such expectations and solutions can be translated into a list of quality criteria and indicators from the user's perspective. The qualities suggested by these users could be refined and expanded into a checklist for testing the quality of a system. Features suggested by users could be replaced by other features with similar functions by the system provider.

6. References

[1] R. Othman and N.S. Halim, "Retrieval Features for Online Databases: Common, Unique and Expected", Online Information Review, Emerald Group Publishing, Bradford, England, 2004, pp. 200-210.

[2] R. Othman, "An Applied Ethnographic Method for Evaluating Retrieval Features", The Electronic Library, Emerald Group Publishing, Bradford, England, 2004, pp. 425-432.

[3] A. Tombros and C.J. van Rijsbergen, "Query-Sensitive Measures for Information Retrieval", 2004, available at: http://www.dcs.qmul.ac.uk/~tassos/publications/kais04.pdf (accessed August 2004).

[4] S.E. Robertson and M. Hancock-Beaulieu, "On the Evaluation of IR Systems", Information Processing and Management, Elsevier, Netherlands, 1992, pp. 457-466.

[5] F.W. Lancaster and A.J. Warner, Information Retrieval Today, Information Resources, Arlington, 1993.

[6] G.G. Chowdhury and S. Chowdhury, Introduction to Digital Libraries, Facet Publishing, London, 2002.

[7] B. Shneiderman, Designing the User Interface: Strategies for Effective Human-Computer Interaction, Addison-Wesley Longman, Menlo Park, California, 1998.

[8] P. Jacso, "Using Controlled Vocabulary (Content Part)", Online Information Review, Emerald Group Publishing, Bradford, England, 2003, pp. 284-286.

[9] H. Xie, "Supporting Ease-of-Use and User Control: Desired Features and Structure of Web-Based Online IR Systems", Information Processing and Management, Elsevier, Netherlands, 2003, pp. 899-922.

[10] A.A. Shiri, C. Revie, and G. Chowdhury, "Thesaurus-Enhanced Search Interfaces", Journal of Information Science, Sage Publications, Thousand Oaks, California, 2000, pp. 111-122.

[11] D. Curtis, "Evaluation Criteria for Online Databases", 2002, available at: http://www.bcpl.net/~dcurtis/reference/images/Evaluation%20Criteria.pdf (accessed September 2004).

[12] A. Oulanov and E.J.Y. Pajarillo, "Usability Evaluation of the City University of New York CUNY+ Database", The Electronic Library, Emerald Group Publishing, Bradford, England, 2001, pp. 84-91.

[13] G. Brajnik, S. Mizzaro, and C. Tasso, "Evaluating User Interfaces to Information Retrieval Systems: A Case Study on User Support", Proceedings of SIGIR '96, Zurich, Switzerland, ACM Press, 1996, pp. 128-136.

[14] S. Dumais, E. Cutrell, J.J. Cadiz, G. Jancke, R. Sarin, and D.C. Robbins, "Stuff I've Seen: A System for Personal Information Retrieval and Re-Use", Proceedings of SIGIR '03, Toronto, Canada, ACM Press, 2003, pp. 72-79.

[15] A.A. Shiri and C. Revie, "Thesauri on the Web: Current Developments and Trends", Online Information Review, Emerald Group Publishing, Bradford, England, 2000, pp. 273-279.

[16] P. Jacso, "Query Refinement by Word Proximity and Position", Online Information Review, Emerald Group Publishing, Bradford, England, 2004, pp. 158-161.



Table 1 (continued)

3. User interface (continued)
   e. Journal browsing
   f. Index browsing

4. Thesaurus-enhanced search
   a. Information visualization of terms and term-to-term relationships
   b. Visualization of the search results
   c. Term mapping
   d. Phrase to alternate terms with stop words
   e. Detailed scope note with synonyms and antonyms

5. Help mechanism
   a. Detailed explanations of the retrieval features
   b. Clear search examples
   c. A list of search operators
   d. Context-sensitive help
   e. Search guide or online tutorial

6. Feedback mechanism
   a. Suggest terms from thesaurus
   b. Refer users to search guide or online tutorial

7. Administrative considerations
   a. Copyright policy
   b. Password for off-campus access
