Posted on 19-Jul-2015

Towards Task Analysis Tool Support

Suzanne Kieffer¹, Panos Markopoulos², Nikolaos Batalas²

¹ Université catholique de Louvain, Louvain School of Management, Louvain-la-Neuve, Belgium

² Eindhoven University of Technology, Industrial Design, Eindhoven, The Netherlands

User goals, tasks and work environment

User errors, breakdowns in the task and workarounds

Task Analysis

Other Usability Engineering Tasks

Usability Goals Setting

Work Reengineering

User Interface Design

Task Analysis remains resource intensive

Face-to-face interaction

User observation

Note taking

Audio/video recording and transcribing

Data collection

Analyst efficiency

Analyst workload

User time and effort

In situ data collection: Ambulatory Assessment methods

Room for improvement

Purpose: to assess the ongoing behaviour, knowledge and experience of people during task execution in their natural setting

Examples: experience sampling, repeated-entry diaries, ecological momentary assessment, acquisition of ambient signals

Ambulatory Assessment (AmA)

To what extent can AmA methods support in situ data collection during task analysis procedures?

1. Task model hypothesis
   Analysis of procedures and artefacts
   Setting of questions and experimental design

2. Tool-supported in situ data collection
   Users: expertise and responsibility
   Tasks: frequency, criticality and complexity
   Problems and errors

3. Contextual observations/interviews

Method

Case study

Hot-Dip Galvanizing on Continuous Lines

Step 1 – Task model hypothesis

Artefact: paper checklist

Q1. Please indicate your degree of familiarity with this task

Q2. How frequently is this task executed?

Q3. Please indicate when it was last executed

Q4. Please indicate when it will next be executed

Q5. Please select all the possible contexts where it takes place

Q6. Why does it have to be executed?

Q7. Please indicate a means to facilitate or improve this task

Q8. Please give an example of a possible problem during its execution

Q9. Please give an example of an error committed during its execution

Q10. Please select from the list all the participants in this task

Q11. Please indicate who asks for its execution

Q12. Please indicate to whom the related result is communicated

Setting of questions

30 items

12 questions

29 items × 12 questions + 1 question per participant

Experimental setup

4 key users per shift × 3 shifts = 12 participants

4200 questions overall, 350 questions per participant

9 days, 40 questions a day per participant
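The protocol arithmetic above can be checked with a short sketch. Note that the per-participant figure of 350 on the slides appears to be rounded: 29 items × 12 questions + 1 question is 349 exactly. Variable names below are illustrative.

```python
# Check of the sampling-protocol arithmetic from the experimental setup.
# Figures are taken from the slides; 350 per participant appears to be
# the rounded value (29 items x 12 questions + 1 question = 349 exactly).

key_users_per_shift = 4
shifts = 3
participants = key_users_per_shift * shifts          # 12 participants

exact_per_participant = 29 * 12 + 1                  # 349 questions
reported_per_participant = 350                       # figure on the slides

total = reported_per_participant * participants      # 4200 questions overall
days = 9
per_day = reported_per_participant / days            # ~38.9, reported as ~40

print(participants, exact_per_participant, total)
```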

Step 2 – In situ data collection

TEMPEST

1. Prepare your material (questions and protocol)

2. Program sequences of questions

3. Create participants

4. Fire questions

5. Analyze answers
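The five TEMPEST steps above can be sketched as a minimal data model. TEMPEST itself is a web-based platform and its real interface is not shown on the slides, so every class and method name here is a hypothetical illustration of the workflow, not TEMPEST's API.

```python
# Hypothetical sketch of the five-step TEMPEST workflow above.
from dataclasses import dataclass, field

@dataclass
class Question:
    qid: str
    text: str

@dataclass
class Protocol:
    # Step 1: prepare your material (questions and protocol)
    questions: list
    # Step 2: program sequences of questions (one sequence per checklist item)
    sequences: dict = field(default_factory=dict)
    # Step 3: create participants
    participants: list = field(default_factory=list)
    # Steps 4-5: fired questions and collected answers
    answers: dict = field(default_factory=dict)

    def program_sequence(self, item, qids):
        self.sequences[item] = qids

    def fire(self, participant, item):
        # In the study, about 40 questions a day were fired per participant.
        return [q for q in self.questions if q.qid in self.sequences[item]]

    def record(self, participant, qid, answer):
        self.answers.setdefault(participant, {})[qid] = answer

# Usage: one checklist item, two of the twelve questions from Step 1.
p = Protocol(questions=[Question("Q1", "Degree of familiarity"),
                        Question("Q2", "How frequently executed?")])
p.program_sequence("checklist-item-1", ["Q1", "Q2"])
p.participants.append("user-01")
fired = p.fire("user-01", "checklist-item-1")
p.record("user-01", "Q1", "very familiar")
print(len(fired))  # 2
```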

Step 3 – Contextual observations/interviews

Observations/interviews

Key functions: team leader, mobile operator, bath operator

Key aspects: communication flow, countdown of items, intra-team collaboration, problems or errors

Results & discussion

Challenge

Unfriendly work environment

Complex work organization

Collaborative

Distributed in space and time

Rotating shifts

With vs. without tool support

                        With TEMPEST                                 Without TEMPEST
Analyst's efficiency    Increased productivity; increased accuracy   Limited productivity; risk of mistakes
Analyst's workload      Automated & remote; safe & comfortable;      Manual & face-to-face; difficult & tedious;
                        structured process                           unstructured process
User's time & effort    38 hours overall in 9 days;                  36 hours overall (estimate);
                        20 minutes a day per user                    3 hours per user (estimate)
Questions               Timely, with snooze option;                  Disruptive; intrusive
                        rather not intrusive
Answers                 Complete results                             Fragmented results

Requirements

Supporting tools

Analyst configurability

Real-time monitoring and traceability of responses

On-the-fly adaptation of the sampling protocol

Data collection across platforms (responsiveness)
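Two of the requirements above, real-time monitoring with traceability of responses and on-the-fly adaptation of the sampling protocol, can be sketched as follows. All names are illustrative assumptions, not part of TEMPEST.

```python
# Hedged sketch of a response monitor for an ambulatory-assessment tool:
# counts answers as they arrive, flags lagging participants, and lowers
# a participant's daily quota on the fly.
from collections import defaultdict

class SamplingMonitor:
    def __init__(self, expected_per_day):
        self.expected = expected_per_day
        self.received = defaultdict(int)  # participant -> answers today

    def on_answer(self, participant):
        # Traceability: every response is counted as it arrives.
        self.received[participant] += 1

    def lagging(self):
        # Real-time monitoring: who is behind on today's quota?
        return [p for p, n in self.received.items() if n < self.expected]

    def adapt(self, quotas, participant):
        # On-the-fly adaptation: e.g. lower the daily quota for a
        # participant who reports the schedule as disruptive.
        quotas[participant] = max(10, quotas[participant] - 10)
        return quotas[participant]

# Usage: the study's quota of ~40 questions a day per participant.
monitor = SamplingMonitor(expected_per_day=40)
monitor.on_answer("user-01")
quotas = {"user-01": 40}
print(monitor.lagging())                  # ['user-01']
print(monitor.adapt(quotas, "user-01"))   # 30
```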

Task model hypothesis

Guidelines for analysts

Mapping with the sampling protocol

Mapping with the responses

Take away

Task Analysis Tool Support (TATS)

Method and TEMPEST

Feasibility and cost-efficiency of TATS

Requirements for conducting TATS

Contact details

suzanne.kieffer@uclouvain.be

n.batalas@tue.nl

p.markopoulos@tue.nl

TEMPEST survey http://goo.gl/DTgdqC

Thank you!

Definition of the key users

Divergences

Convergences

Reasons to execute a task (Q6): instructions, cleanliness and quality

Means to improve the tasks (Q7): automation, better care of the zinc bath and new equipment

Problems (Q8): technical problems and accidents

Errors (Q9): related to manipulation of the zinc bath and lack of time

“The questions interfered with my schedule”

Satisfaction questionnaire, 5-point Likert scale

Shift A = 3.50, equally distributed between "neutral" and "agree"
Shift B = 2.67
Shift C = 2.25

Most of the participants (10/12) thought they answered between 15 and 30 questions a day, while they actually answered about 40
