The Pragmatic Evaluation of Tool System Interoperability


DESCRIPTION

A. de Moor (2007). The Pragmatic Evaluation of Tool System Interoperability (invited paper). In Proc. of the 2nd ICCS Conceptual Structures Tool Interoperability Workshop (CS-TIW 2007), Sheffield, UK, July 22, 2007. Research Press International, Bristol, UK, pp.1-19.

TRANSCRIPT

The Pragmatic Evaluation of Tool System Interoperability

Aldo de Moor, CommunitySense. CS-TIW, July 2007

Once upon a time...

Those days are gone!

Tool systems

Tool system: the set of integrated and customized information and communication tools tailored to the specific information, communication, and coordination requirements of a collaborative community

No standard prescriptions: communities need to evaluate the functionalities in their unique context of use

Technical comparison is not enough

The “orchestra metaphor”

How to create a well-tuned orchestra of tools able to perform a magnificent symphony?

Go beyond the technical abilities of the individual tools

Practice, and trial and error, lead to synergy and alignment

Where is the conductor?

Tool system interoperability

How to assess the interoperability of a tool system in a particular usage context?

Interoperability: the need to make heterogeneous information systems work in the networked world (Vetere and Lenzerini)

The ongoing process of ensuring that the systems, procedures, and culture of an organisation are managed in such a way as to maximise opportunities for exchange and re-use of information, whether internally or externally (Miller)

Pragmatic evaluation

Much research focuses on syntactic and semantic interoperability, e.g. the UDDI standard (Universal Description, Discovery and Integration): rules for building service directories and facilitating top-down querying

Pragmatic interoperability?

Link standards to context-dependent needs of user communities

How?
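Not part of the talk: a toy sketch of the top-down querying the slides attribute to UDDI-style service directories, browsing from business to service to binding. The real UDDI API is SOAP-based; the dictionary structure, names, and URLs below are purely illustrative.

```python
# Toy service directory illustrating top-down discovery: browse from
# business entity to service to binding, loosely echoing UDDI's data
# model. All entries and names are illustrative, not the real UDDI API.

directory = {
    "CommunitySense": {                    # business entity
        "co-authoring": {                  # business service
            "binding": "http://example.org/coauthor/api",  # access point
            "category": "collaboration",
        },
        "version-control": {
            "binding": "http://example.org/versions/api",
            "category": "coordination",
        },
    },
}

def find_services(category):
    """Top-down query: walk businesses, then services, filter by category."""
    return [
        (business, service, info["binding"])
        for business, services in directory.items()
        for service, info in services.items()
        if info["category"] == category
    ]

print(find_services("collaboration"))
# [('CommunitySense', 'co-authoring', 'http://example.org/coauthor/api')]
```

The pragmatic question the talk raises is precisely what such a syntactic directory cannot answer: which of the discovered services actually fits a community's context of use.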

Questions

How to conceptualize the usage context in tool system interoperability evaluation?

What would an evaluation procedure look like?

How would such a procedure influence design choices?

Goals

Construct minimal conceptual model of pragmatic evaluation methods for tool system interoperability

Can be used for developing whole classes of methods specifically tailored to communities

Make pragmatics explicit

Find common ground for pragmati(ci)sts, tool builders, and designers

Case: co-authoring a call for papers

2006 International Pragmatic Web conference

Three co-chairs in different countries

Call for papers written by e-mailing Word files around

The Pragmatic Web was a new paradigm: confusion abounded, no convergence

Co-evolution of requirements led to a satisfactory tool system solution

Co-authoring tool system v1

[Diagram: Author 1, Author 2, and Author 3 exchanging Version Author 1, Version Author 2, and Version Author 3 among themselves]

Co-authoring tool system v2

[Diagram: Author 1, Author 2, and Author 3 exchanging Version Author 1, Version Author 2, and Version Author 3, now also connected to the Conference]

Co-authoring tool system v3

[Diagram: Author 1, Author 2, and Author 3 / Editor, plus the Conference; agreed lines, (modified) paragraphs, and chat link individual author versions to a shared Version-in-Progress]

A conceptual model of the tool system

Functionality: a set of functions and their specified properties that satisfy stated or implied needs (SEI)

Different levels of granularity: systems, tools, modules, functions

Interfaces, information objects, information/communication processes
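The granularity levels just named (system, tool, module, function) can be sketched as a simple nested data structure. The example tools and modules below are hypothetical, chosen to echo the co-authoring case; they are not from the talk.

```python
# Sketch of the functionality granularity hierarchy described above:
# tool system > tool > module > function. Example content is illustrative.

from dataclasses import dataclass, field

@dataclass
class Function:
    name: str

@dataclass
class Module:
    name: str
    functions: list = field(default_factory=list)

@dataclass
class Tool:
    name: str
    modules: list = field(default_factory=list)

@dataclass
class ToolSystem:
    name: str
    tools: list = field(default_factory=list)

coauthoring = ToolSystem("co-authoring", tools=[
    Tool("e-mail", modules=[
        Module("attachments", functions=[Function("send file")]),
    ]),
    Tool("chat", modules=[
        Module("messaging", functions=[Function("post line")]),
    ]),
])

# Count functions across the whole system, one per leaf of the hierarchy.
n_functions = sum(len(m.functions) for t in coauthoring.tools for m in t.modules)
print(n_functions)  # 2
```

Evaluation can then attach scores at whichever level of this hierarchy a community cares about, from whole tools down to single functions.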

Example

A conceptual model of the usage context

(De Moor, 2005): Patterns for the Pragmatic Web

Pragmatic context = common context + set of individual contexts

Concepts, definitions, communicative interactions, context parameters

Focus on the meaning negotiation process

Current focus: the pragmatic patterns themselves

Usage context: goals

Goals: activities and aspects. They give a sense of purpose, drive people and processes, and provide evaluation criteria

Activities: operationalized goals, i.e. processes with a concrete deliverable as outcome, e.g. writing a call for papers or making a group assignment. High-level workflows: we are interested in potential functionalities, not implementation details

Aspects: abstract goals cutting across processes and structures, e.g. security, interactivity, effectiveness

Usage context: actors

“The user” does not exist: there are many stakeholders, each with their own needs, interests, and goals

Actor roles are increasingly important: they define responsibilities in workflows and access to functionalities and information resources, e.g. the Role-Based Access Control (RBAC) paradigm

Actor role typologies

Currently mostly technology-focused: Administrator, Facilitator, Member, ...

Need to become much more contextualized: customized responsibilities and access rights

Examples:
Workflow-based: Author, Reviewer, Editor, ...
Organization-based: Secretary, Manager, Team Leader, ...
Domain-specific: Environmental Protection Agency, Corporation, NGO, ...
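A minimal sketch, in the RBAC spirit the slides invoke, of how contextualized roles bundle access rights: roles grant permissions, users act through assigned roles. The workflow-based roles follow the Author/Reviewer/Editor example above; the specific permissions and user names are illustrative assumptions.

```python
# Minimal role-based access control sketch: roles map to permission
# sets, users hold roles, and an action is allowed if any of the
# user's roles grants it. Roles echo the slide's workflow-based
# typology; permissions and users are hypothetical.

ROLES = {
    "Author":   {"read_draft", "edit_draft"},
    "Reviewer": {"read_draft", "comment"},
    "Editor":   {"read_draft", "edit_draft", "comment", "publish"},
}

assignments = {"alice": {"Author"}, "bob": {"Reviewer", "Editor"}}

def permitted(user, action):
    """A user may perform an action if any assigned role grants it."""
    return any(action in ROLES[r] for r in assignments.get(user, ()))

print(permitted("alice", "publish"))  # False
print(permitted("bob", "publish"))    # True
```

Contextualizing roles then amounts to defining such role/permission tables per community and workflow, rather than reusing a fixed technology-centric set.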

Usage context: domain

Major influence on evaluation processes and tool system functionalities, but still ill-understood

Determinants:
Structure and size: e.g. distributed, centralized, small, large
Setting: academic, corporate, governmental, non-governmental
Financial: resources for customization, or off-the-shelf software only?
Political: are certain software choices mandatory or prohibited?

The pragmatic evaluation process

The scoring process

The main process in which stakeholders reflect on the role of functionalities in a complex usage context

There are many ways to do so, e.g. Bedell's method for evaluating IT functionality effectiveness: score functionalities on their effectiveness and importance for activities. Problem: complex, time-consuming, many levels of aggregation

A practical method for courseware evaluation

Questions:
1. How well are the various activities supported by the various functionalities?
2. How effectively are the various functionality components used?

Goal scores and functionality scores: users, in their actor roles, provide, interpret, and use the scores in decision making

Context: courseware evaluation
Actors: students, software manager
Tool system level: module

Goal and functionality scores

Elements:
I(g) = importance of goal g
I(f,g) = importance of functionality f in supporting goal g
Q(f,g) = quality of functionality f in supporting goal g

G-score(g) = Σ_i I(f_i, g) × Q(f_i, g), summed over all functionalities f_i

F-score(f) = Σ_j I(g_j) × I(f, g_j) × Q(f, g_j), summed over all goals g_j
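The two scores can be computed directly from the elements just defined. A minimal sketch, with illustrative importance and quality values on a single goal (the numbers, goal, and functionality names are made up, not data from the talk):

```python
# G-score: how well a goal g is supported, summing importance x quality
# over all functionalities. F-score: how effectively a functionality f
# is used, weighting each goal by its own importance I(g). Formulas as
# defined above; the sample data below is illustrative only.

def g_score(g, functionalities, I_fg, Q_fg):
    """G-score(g) = sum over f of I(f,g) * Q(f,g)."""
    return sum(I_fg[(f, g)] * Q_fg[(f, g)] for f in functionalities)

def f_score(f, goals, I_g, I_fg, Q_fg):
    """F-score(f) = sum over g of I(g) * I(f,g) * Q(f,g)."""
    return sum(I_g[g] * I_fg[(f, g)] * Q_fg[(f, g)] for g in goals)

goals = ["write_cfp"]
functionalities = ["email", "chat"]
I_g = {"write_cfp": 5}
I_fg = {("email", "write_cfp"): 4, ("chat", "write_cfp"): 2}
Q_fg = {("email", "write_cfp"): 3, ("chat", "write_cfp"): 5}

print(g_score("write_cfp", functionalities, I_fg, Q_fg))  # 4*3 + 2*5 = 22
print(f_score("email", goals, I_g, I_fg, Q_fg))           # 5*4*3 = 60
```

In the courseware case the scores would come from students and the software manager filling in I and Q values per module and activity; the arithmetic itself stays this simple.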

Experiment: group assignments

Two courseware tools: Blackboard, CourseFlow
Goal: making group assignments
Four activities, 11 functionality modules
Actors: 2nd-year Information Management students, software manager
2002: 62 students, 16 groups; 2003: 46 students, 12 groups

Questions:
What is the quality of the tools for the various group assignment activities?
How useful are the various functionality modules?

Activity scores

Functionality scores

Evaluation ++

More advanced goal concepts, e.g. maintainability. However, there is a tradeoff between methodological power and ease of use!

Link to existing activity and quality aspect frameworks:
Activities: e.g. BPMN, workflow patterns
Aspects: IS quality frameworks, e.g. Delen and Rijsenbrij; DeLone and McLean

Link to existing evaluation methods from the quality IS literature

Contrast evaluations by different actors: students have different interests from the lecturer! Build on techniques for multi-stakeholder dialogues

Better balance informal and formal approaches: hermeneutic approaches meet conceptual structures

Link to applied pragmatic philosophy in IS development, e.g.:
Testbed development methodologies (Keeler and Pfeiffer)
Trikonic architectonic (Richmond)
Active knowledge systems (Delugach)
Goal-oriented transaction modeling (Polovina et al.)

Evaluating virtual worlds

Conclusions

Functionality selection = balancing collaborative community requirements with an interoperable tool system

Pragmatic evaluation of tool system interoperability: socio-technical evolution of tool system and usage context

Conceptual framework for pragmatic evaluation method construction and comparison

Fundamental problem: the infinite variety of usage contexts; a balance is needed between formal and informal interpretation

Conceptual structures tools could be the missing link between the human capacity to interpret context and the computational power to analyze patterns

Indispensable for the continuously evolving, context-sensitive collaboration systems of the future
