An Overview of Evaluation Methods

Page 1:

Jytte Brender
Associate Research Professor

Virtual Centre for Health Informatics & Aalborg University

An Overview of Evaluation Methods

Page 2:

Evaluation …is the act of measuring or exploring properties of a health information system (in planning, development, implementation, or operation),

the result of which informs a decision to be made concerning that system in a specific context

[Ammenwerth et al. 2004]

What is 'evaluation'?

measuring or exploring properties

the result of which informs a decision to be made

Page 3:

Metaphorically speaking, evaluation is like a torch being waved in the dark in front of you:
• screening for options, obstacles and barriers,
• diagnosing problems within a vast set of observations,
• forecasting risks, while
• monitoring success and failure indicators,

but NOT:

… for treating identified obstacles and symptoms of failure

Evaluation is seen as a means to:
• maximise success
• minimise failure

Page 4:

My intention is to view available methods in the perspective of the nature and needs of evaluation

• nature of different systems

• success & failure factors

• nature of evaluation methods

• pitfalls & perils in evaluation

inventory of evaluation methods

Page 5:

[Figure: types of IT-based systems (HIS, PAS, LIS, RIS/PACS, KBS, ES/DSS, EHCR, education) arranged along axes of speed/effectiveness and comprehensibility, spanning three groups:]

low mental load (repetitive activities):
• circumscribed tasks • repetitive tasks • many users • invariant staff

high mental load (cognitive activities):
• complex tasks • data plethora • decision support • fewer users • invariant staff

high mental load (ergonomic activities):
• complex tasks within a dynamically changing context • unique cases • decision support • many users • composite and variable staff

Page 6:

[Figure: assessment focus versus type of system. Types of system: HIS/PAS, LIS, RIS, KBS, ES/DSS, EHCR, education. Assessment foci: technical verification, ergonomic assessment, cognitive assessment, functionality assessment, impact assessment.]

Page 7:

Where are the success and failure indicators?

Everywhere! But mainly within the many processes that lead to an IT-based solution:

• Functionality
• Organisation
• Managerial
• Behavioural
• Political
• Cultural
• Technical
• Publicity
• Economy
• Education

Page 8:

What are the barriers to success?

Major Obstacles to Success [Crosswell 1991] (39 cases)

• Organisational coordination and conflicts
• Data & software standards
• Planning / management support
• Training / understanding of technology
• Database structure & source material
• Funding availability
• Data communication and networking
• Apathy / fear of change
• SW complexity / maturity
• Staffing availability / recruitment

(individual obstacles affected from about 5% to 54% of the cases)

Page 9:

What are the barriers to success?

Obstacles to Success [Price Waterhouse 1997] (500 cases)

• Insufficient management of change skills
• Competing resource priorities
• Presence of functional barriers
• Lack of middle management support
• Insufficient communication
• Employee opposition
• Insufficient training and coaching
• No perceived need for change
• Long lead-time on IT solutions
• Initiative fatigue
• Bottom-line benefits not understood
• Insufficient involvement of employees
• Insufficient focus on people issues
• Inappropriate leadership style
• Lack of clarity of objectives and vision

(individual obstacles affected from about 20% to 40% of the cases)

Page 10:

Critical success factors!

Critical Success Criteria [Price Waterhouse 1997] (500 cases)

• Ensuring sponsoring from the top
• Treating employees fairly
• Involving employees throughout
• Quality communication
• Quality training
• Clear performance measures
• Building teams after change
• Focus on culture / skill changes
• Rewarding success
• Internal champions / change agents
• Managing stakeholders
• Formal project management
• Phasing the changes

(individual criteria rated highly important by roughly 44% to 82% of respondents)

Page 11:

Critical success factors!

Top Indicators of Success [Bikson & Evelyn 1989] (55 cases)

• User participation
• Flexible planning balancing social and technical issues
• Long-term learning support
• Users identify the benefit from change for them
• The change perceived as leading edge

Page 12:

The people and organisational issues seem to be a key factor for success

A system is: "All the components, attributes and relationships needed to accomplish an objective"

[Haimes & Schneiter 1996]

Page 13:

Available methods and tools ... an introduction to an inventory of methods

Page 14:

Each method in the inventory is described by means of a standard template:

<method name>

Application area: What the method may be applied for. Icons together characterise the assessment method's application area within the lifecycle of the IT-system, including its applicability for the mentioned purpose.

Description: What the method includes and how this is accomplished.

Assumptions for application: Necessary (non-trivial) conditions for applying the method with a useful result.

Perspectives: Aspects of importance for the accomplishment or the interpretation of results, such as the philosophy behind the method, including hidden assumptions.

Frame of reference: What might serve as the frame of reference.

Pitfalls and perils: Where, how and why one may risk specific pitfalls or perils.

Tips and comments: Small tricks and advice that may help the accomplishment of the investigation, or how to modify the method.

References: A complete list of references used from the literature, and of supplementary valuable references.
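As an aside, if one wanted to build a machine-readable inventory on top of this template, the fields could be captured in a small data structure. A minimal Python sketch, assuming only the field names listed above; the class name and the example entry are illustrative, not part of the original material:

from dataclasses import dataclass, field
from typing import List

@dataclass
class MethodDescription:
    # Field names mirror the template above; the content of each field is free text.
    name: str                                                              # <method name>
    application_area: str = ""                                             # what the method may be applied for
    description: str = ""                                                  # what the method includes and how it is accomplished
    assumptions_for_application: List[str] = field(default_factory=list)   # non-trivial conditions for a useful result
    perspectives: List[str] = field(default_factory=list)                  # philosophy and hidden assumptions
    frame_of_reference: str = ""                                           # what might serve as the frame of reference
    pitfalls_and_perils: List[str] = field(default_factory=list)
    tips_and_comments: List[str] = field(default_factory=list)
    references: List[str] = field(default_factory=list)

# Example entry (content paraphrased from the questionnaire slide that follows)
questionnaires = MethodDescription(
    name="Questionnaires",
    application_area="Eliciting qualitative measures of subjective aspects",
    frame_of_reference="Depends on the purpose of the study",
)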

Page 15:

Questionnaires

Application area: Questionnaires have been used to ask for almost anything, but should primarily be used to elicit qualitative measures of subjective aspects.

Description: Many different types of questions, which may be combined at random.

Assumptions for application: Validation of the questions, and use of the right analysis tools.

Perspectives: "Everyone can formulate questions"

Frame of reference: Depends on the purpose of the study.

Pitfalls and perils: • internal validity • context of response • psychological aspects • linguistic aspects • cultural aspects • who to ask
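To make "validation of the questions" and "use of the right analysis tools" more concrete: one commonly used check of a multi-item questionnaire's internal consistency is Cronbach's alpha. This is an illustration of my choosing, not prescribed by the slide; a minimal Python sketch with purely hypothetical ratings:

import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    # item_scores: matrix of shape (n_respondents, n_items)
    item_scores = np.asarray(item_scores, dtype=float)
    n_items = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)        # variance of each item
    total_variance = item_scores.sum(axis=1).var(ddof=1)    # variance of respondents' total scores
    return n_items / (n_items - 1) * (1 - item_variances.sum() / total_variance)

# Hypothetical data: 5 respondents rating 4 items on a 1-5 scale
ratings = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(f"Cronbach's alpha = {cronbach_alpha(ratings):.2f}")  # rule of thumb: >= 0.7 is often considered acceptable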

Page 16:

Functionality Assessment

Application area: • validation of objectives fulfilment • impact/effect assessment • identification of problems in the IT <-> organisation interaction

Description: [Figure: the method compares a normative system (Step 1) with the real system (Step 2), identifies divergences (Step 3), performs a causal analysis (Step 4), and feeds the result into re-design of the system (Step 5); the organisation is modelled in terms of Leavitt's components: task, technology, actors and structure, within their environment.]

Assumptions for application: • information access • diagramming experience

Perspectives: • the organisation changes dynamically • Leavitt's model for organisational change • Mumford's model for Participative Design

Frame of reference: Built into the method.

Pitfalls and perils: The causal analysis is vulnerable, because it may be difficult to access/elicit/verify the true causal mechanisms.

Tips and comments: 1. Other frameworks may be applied, like structure, process, outcome. 2. Any diagramming technique may be applied.

References: Brender J. Methodology for Assessment of Medical IT-Based Systems. IOS Press, vol. 42, 1997.

Page 17:

Inventory of methods

Method: Application purpose

Cognitive walkthrough: Assessment of system design and early prototypes; assessment of demos
Clinical/diagnostic performance: Measurement of diagnostic accuracy and precision
Balanced Scorecard: Strategic management
Assessment of bids: A set of techniques for comparative assessment of bids
Cognitive assessment: Identification of where and why operation errors occur; identification of focus points for improvement of user friendliness and usability
Delphi: Trends assessment; assessment of the future
BIKVA: Critical, subjective assessment of an existing practice

Page 18:

Method: Application purpose

Future Workshop: Situation analysis aimed at identification of focus points for change
Grounded Theory: Text analysis
Functionality assessment: Assessment of objectives fulfilment, and identification of focus points for change
Framework for assessment of strategies: Comparative judgement of development strategies
Field study: Observation of and within an organisation to identify practice and mechanisms controlling change
Effect assessment: Combined assessment of the degree of objectives fulfilment and of side effects (impact assessment)
Focus Group Interview: Elicitation of patterns of attitudes and ways/modes of acting within/among social groups; elicitation of problem areas within the organisation or the IT-based solution

Page 19:

Method: Application purpose

Questionnaires: Qualitative investigations of subjective aspects, like attitudes and opinions
Prospective time studies: Measurement of trends, in particular the effect of a specific initiative
Requirements Assessment: Assessment of a User Requirements Specification
Pardizipp: Identification of future scenarios
Modelling of work procedures: Impact assessment
KUBI: Incremental optimisation of outcome to comply with policy and aim
Heuristic evaluation: (anything)
Logical Framework Approach: Situation analysis, aiming at identifying focus points for change
Interview: Exploration of individuals' attitudes, opinions, and perception of phenomena and observations

Page 20:

Method: Application purpose

User acceptance and satisfaction: Assessment of opinions, attitudes and perceptions
SWOT: Situation analysis
Social network analysis: Assessment of relations between elements in an organisation, influencing acceptance and use of future solutions
RCT: Evaluation of efficacy, i.e. that the IT-system (under ideal circumstances) improves patient care
Risk Assessment: Monitoring of risk aspects
Stakeholder analysis: Assessment of who to involve
Video recording: Analysis of complex patterns of interaction; data acquisition and documentation in general
Usability: Assessment of user friendliness in terms of ergonomic aspects
Technical verification: Systematic assessment of requirements fulfilment as part of an acceptance trial

Page 21:

There are plenty of methods and frameworks, and a fair number of methodologies to address users’ need for evaluation

... but are they sufficient?

Metaphorically speaking, evaluation is like a torch being waved in the dark in front of you:
• screening for options, obstacles and barriers,
• diagnosing problems within a vast set of observations,
• forecasting risks, while
• monitoring success and failure indicators.

Page 22:

My intention is to view available methods in the perspective of the nature and needs of evaluation

• nature of different systems

• success & failure factors

• nature of evaluation methods

• pitfalls & perils in evaluation

Inventory of evaluation methods

Page 23:

[Figure: the same triad of technical aspects, people aspects and organisational aspects shown as 2D, 3D and 4D views.]

3D ... takes into account the complexity, like natural variation in daily work and practice.

4D ... takes into account both the complexity and the dynamically changing situation, as well as their interactions.

[Figure: example model of a clinical laboratory, with Leavitt's components (structure, technology, actors, task) and activities such as requesting, analytical chemical work, production, calibration/calculation, technical inspection, quality management, management and maintenance, connected by flows of orders, priorities, needs and schedules, samples, rerun information, raw data, interim results, quality control criteria, change parameters, production control procedures and laboratory service order information.]

Page 24:

So,

the challenge posed by the Information Age is the increasingly complex IT-solutions and the change management they imply

... and

the challenge for the evaluation professionals is to continue developing appropriate methods,

while continuing to identify and explore indicators for success & failure

technical aspects
people aspects
organisational aspects

Page 25:

Another challenge: (typical) pitfalls and perils

Pitfalls and perils were identified within the medical and social sciences:

- all have analogues in experimental assessment of IT-based solutions within the healthcare sector
- very few evaluation studies of IT-based solutions within the healthcare sector succeed in avoiding even the simplest ones

Page 26:

Study design biases

• Aspects of culture
• Circumscription of study objectives
• Selecting the study methodology/method
• Defining methods & materials
• (User) Recruitment
• (Case) Recruitment
• The frame of reference
• Outcome measures or end-points
• Construct validity of methods, techniques and tools
• Prophylactic grounding
• Prognostic measures
• The (un-)used user
• Diagnostic suspicion
• The placebo effect
• Hawthorne Effect
• Co-intervention
• A before-and-after study
• Carry-over Effect
• Check-list Effect
• Human cognition/recognition
• Usability and user acceptability
• (Adverse) confounding effects
• More than one population included as the subject of investigation
• Compliance between application domain and target domain
• Use of artificial scenarios or set-up
• Subjectivity versus objectivity
• Assumptions for methods' application
• Circular inference
• Application range
• The Substitution Game
• Comparing apples and pears
• Technical verification of outcome measures
• Constructive assessment
• Complex causal relations
• Delimitation of the investigation space
• Ambiguous or changing study objectives
• Stakeholder analysis
• Targeting the methodology/method to the study aim

Page 27:

Biases related to data collection, analysis and interpretation

• Beyond the developer's point of view
• Unstable technology
• Hindsight or insight bias
• Judgement of the probability of events
• Judgement of performance
• Miscellaneous judgemental biases
• Assessment Bias
• Evaluation paradox
• Constructive assessment
• Action-case research
• Half of the story
• Presentation of the future
• Exhaustion of the data material
• A publication dilemma
• Matching of control and study groups
• Implicit domain assumptions
• Competing risks
• False-Negative Studies
• False-Positive Studies
• Validation is not black-and-white
• Randomised controlled trials
• Action-case Research
• The developers' actual engagement
• Intra- and inter-person (or -case) variability
• Illicit Use
• Feedback Effect
• Extra Work
• Judgmental biases
• Post-rationalisation
• Verification of implicit assumptions
• Novelty of the technology: technophile or technophobe
• Spontaneous regress
• False conclusions
• Incomplete studies or study reports
• Hypothesis fixation
• The 'Intention to treat' principle
• Impact

Page 28:

... and then, what is the practical implication?

It may mean
- everything!
- or nothing!

Page 29:

Flaws may completely ruin a study - examples

Two presentations on the same IT-system had opposite conclusions

Triangulation proves that one cannot trust the users’ statements

Some use a t-test to verify significance in questionnaire studies with ordinal ratings
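For ordinal (e.g. Likert-scale) ratings, a rank-based test is the safer default. A minimal Python sketch: the two groups and their ratings are purely hypothetical, not taken from the slides:

import numpy as np
from scipy import stats

# Hypothetical 5-point Likert ratings of the same IT-system from two wards
ward_a = np.array([4, 5, 3, 4, 4, 5, 2, 4, 3, 5])
ward_b = np.array([3, 2, 4, 3, 2, 3, 3, 2, 4, 3])

# A t-test assumes interval-scaled, roughly normal data; Likert ratings are ordinal,
# so a rank-based test such as Mann-Whitney U avoids that mismatch.
u_stat, p_value = stats.mannwhitneyu(ward_a, ward_b, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")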

Page 30:

Flaws may have no or minor implication - examples

You cannot, meaningfully, compare just anything.

The users don't do as they say they do ... but does this matter, when it is their reality against which the IT-system has to be judged?

If the users express that an IT-system is good, does it then matter that someone can prove the opposite?

A lot of the pitfalls and perils have no or marginal effect in a given evaluation study.

Page 31:

The issue of evaluation is not simple!

Page 32:

jytte.brender@[...]