
SOFTWARE PROCESS—Improvement and Practice, Softw. Process Improve. Pract. 4, 125–171 (1998)

Advanced Workflow Management Technologies†

Gregory Alan Bolcer 1 and Richard N. Taylor 2*
1 Endeavors Technology, Inc., Santa Ana, CA, 92705, U.S.A.
2 Information and Computer Science, University of California, Irvine, CA, 92697–3425, U.S.A.

Process, workflow and groupware projects in both the commercial and research worlds have each approached the problem of human communication and coordination with different sets of tools and technologies implemented from different perspectives. Activity dependencies and the philosophical assumptions about how work is performed are managed differently in each discipline. While most of these tools represent promising approaches, they have yet to be widely adopted, particularly in the face of increased usage of online information through the WWW and computer networks. Electronic information coming to the individual and into the corporation is becoming increasingly unmanageable. People are able to create information regardless of their location or method of access due to increasing mobility and interoperability of the tools they use. Because of the common concerns, trends are moving towards overlap and convergence of requirements. However, these communities have yet to identify the union of cross-discipline requirements for a supporting technical infrastructure addressing the level of flexibility, scalability and support for evolution over time required in these real-world settings. This paper surveys the current approaches to project communication and coordination, details the range of requirements across several related disciplines, and identifies various tradeoffs and trends which may affect the adoption, design and evolution of an advanced technical workflow infrastructure. Published in 1998 by John Wiley & Sons, Ltd. This article is a US Government work and is in the public domain in the US.

KEY WORDS: workflow; process; groupware; requirements; tradeoffs; trends

*Correspondence to: Richard N. Taylor, Information and Computer Science, University of California, Irvine, CA, 92697-3425, U.S.A. Email: taylor@ics.uci.edu

†Effort sponsored by the Defense Advanced Research Projects Agency, and Air Force Research Laboratory, Air Force Material Command, USAF, under agreement number F30602-97-2-0021.

Published in 1998 by John Wiley & Sons, Ltd.

The U.S. Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright annotation thereon. The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either expressed or implied, of the Defense Advanced Research Projects Agency, Air Force Research Laboratory or the U.S. Government.


1. INTRODUCTION

Workflow management technologies allow people and organizations to automate, manage and improve the processes that govern interpersonal coordination and collaboration. Processes are the series of day-to-day activities in which people participate and may involve such diverse tasks as planning a project, submitting a travel receipt for reimbursement, getting approval for a proposal, circulating a memo for feedback, tracking customers or writing software. These tasks by themselves are complex enough when performed by an individual, but adding the overhead of communication between several individuals makes the management of these processes difficult. The information needed to accomplish any particular activity may be dispersed around an organization and difficult to collect and use. Likewise, a workflow participant may not know which other participants depend on the documents or information artifacts being produced or the next step in accomplishing the task at hand.

What tools do process stakeholders, such as end-users or managers, have to really manage these processes? The answer is threefold. The academic and industry disciplines of (1) software process, (2) basic workflow, and (3) groupware-based computer supported cooperative work (CSCW) approach the problem with different sets of tools implemented from different perspectives. Each manages the dependencies between activities differently in addition to making different philosophical assumptions about how the system is used to accomplish work. While these systems represent a promising approach to coordination, they have yet to be widely adopted. Usage has shown that no single system brings together all the features needed to address the level of flexibility and evolution required in a real world work context. This paper surveys the current technological trends in each of these disciplines, discusses several of the leading systems in the area and extracts the relevant features of representative systems in order to define the properties needed in an advanced workflow management system and the criteria for evaluating its usefulness.

This first section provides a brief introduction and a roadmap delimiting the scope of the paper with respect to current disciplines addressing coordination, communication and control. Section 2 defines current technological shortcomings, provides a brief historical accounting, and outlines key differentiators between an advanced workflow management system and other approaches involving process, workflow, groupware, management information systems, project management systems and commonly used software productivity tools. Section 3 analyzes an advanced workflow management system’s requirements, including a detailed look at support for multiple stakeholders, incremental adoption, object-oriented workflow, external tool integration, customization and reuse, distribution and dynamic change. Section 4 explores some of the tradeoffs between the requirements enumerated in the previous section, identifies some of the most prominent trends, and illustrates the impact in the design of several systems with respect to these criteria. Section 5 concludes.

2. WHAT IS ADVANCED WORKFLOW?

Advanced workflow is an amalgam of disciplines combining technologies from the domains of traditional or basic workflow, software process, groupware and computer supported cooperative work, management information systems, business process management, project management, and a common usage of widely available commercial off-the-shelf (COTS) software tools including drawing and spreadsheet programs. Advanced workflow technologies differ from traditional offerings in that evolution is embraced as a fundamental philosophy in the design, deployment and implementation of workflows supporting communication, coordination and collaboration between individuals, groups or organizations.

Workflow and process technology has the potential to provide major benefits with regard to information access, understanding and efficiency. Information flow between stakeholders is critical. Workflow and process infrastructure provides the means to support seamless and timely transfer of, access to, and manipulation of pertinent information. Placing the right information at the right place and time is crucial for uninterrupted work including cooperation and collaboration. Discovering, modeling and visualizing processes is also a precursor for improving daily work by allowing complex, error-prone or tedious tasks to be understood and automated. While traditional workflow management technologies are useful, they have yet to be widely adopted in areas where they are most needed. The current generation of technologies generally:

- assume a fixed user model;
- require an all-or-nothing buy-in;
- lack the execution level semantics to do optimization and analysis;
- are point solutions, not only for a domain, but over time;
- provide only limited types for modeling of products;
- embed inflexible control policies which occlude reactive control, graceful exception handling and decentralized coordination.

2.1. Genealogy of Disciplines

In the early 1900s Frederick Taylor called public attention to the problem of ‘national efficiency’ by proposing to eliminate ‘rule of thumb’ management techniques and replacing them with the principle of scientific management (Taylor 1911). Scientific management reasons about the motions and activities of workers and states that wasted work can be eliminated through careful planning and optimizations by managers. While Taylor was concerned about the inputs and outputs of manufacturing and material processes, computers and the information age brought about parallel concepts in the management and organization of information and its processes. Work efficiency could now be measured not by bricks or steel, but by their information flows. Information, in addition, has another dimension, namely quality. In the early days of information technology, it was seen as a tool for management to improve decision-making by evaluating alternatives (Simon 1981). As computers spread to the desktop, this view evolved to allow workers to make more informed decisions. The problem then became how to place the right information, in front of the right people, at the right time. While traditional workflow, software process and CSCW share this goal, their ancestry differs significantly. Only in recent years have the morphological differences been put aside to try and find convergence between the tools, techniques and disciplines.

Traditional workflow was first considered in image-based systems in order to help automate the flow of paperwork through an organization. This idea led to the concept of the paperless office. Image-based systems usually received documents which were then scanned into the system as images. These images could then be stored and handed off based on their number or other metrics to perform work balancing. The problem though was that more intelligent routing and scheduling was difficult because the contents of the images could only be deciphered by the human agents doing the work. Forms-based workflow supplanted most image-based systems in that forms could contain editable fields to allow multiple changes throughout the path of the document. Intelligent routing is accomplished by examining values in the form and determining the next appropriate action and route based on processes and rules. Business process re-engineering (BPR) (Gruhn and Wolf 1995, Hammer and Champy 1993) uses business rules to determine actions and minimize mis-directed energies. Management information systems (MIS) route documents along fixed procedural routes and encourage adherence to the workflow process. Winograd and Flores (Flores 1979, Winograd and Flores 1987) developed the theory of coordination and communication to model how people honor, request and make commitments. By proposing the concept that work takes place as a ‘cycle of coordination’ between individuals, groups and organizations, the perception of work was changed from a data-processing paradigm to computer support for human activities. Modern day workflow systems carry on this distinction and attempt to merge the modeling of human and computer executed activities, improve coordination (Bergman 1996, Malone and Crowston 1990, Nutt 1996) and expand on the current personal computer and network technology infrastructure to support evolution of activities.

CSCW, also known as groupware, has received a significant amount of interest over the past decade. CSCW, which combines research in both the computer and social sciences, attempts to capture the inherent complexities of collaboration between people. As the adoption of workflow and office automation systems was limited due to overspecialization of work procedures (Ellis 1979, Ellis and Nutt 1996) when they were first introduced, research turned to less constrictive models that encouraged collaboration. The development of such systems has been stilted by underestimating the dynamic, evolutionary nature of most collaborative activities. CSCW encompasses a broad listing of technologies. The most successful have been the original technologies of electronic mail (email) and messaging billboards (bboards), but inroads have been made in group decision-making (Kraemer and King 1988). Modern day systems such as Lotus Notes (Lotus Development Corporation 1997) carry on that tradition with their core technologies being organization and presentation of electronic, asynchronous messaging. Synchronous technologies such as video conferencing, shared whiteboards and internet telephony, while interesting, have been supplanted in favor of simpler technologies that work across a variety of platforms such as ‘chat’ (a one-to-many text-based communication technology) or ‘talk’ (one-to-one) (Grudin 1988, Grudin and Palen 1995). Because capturing real work practices was the goal of most CSCW systems, temporal ordering of activities, task dependencies, or the dynamic manipulation and scheduling of these constructs and work models was excluded from many early systems. While these systems placed minimal structural requirements on the participants’ work, they required a fixed collaboration model. Systems like Object Lens (Lai et al. 1988) added a dynamic aspect to collaboration. wOrlds (Fitzpatrick et al. 1995, 1996) augmented this by allowing multiple, dynamic collaboration models to be chosen by the participant during the planning and execution of activities being performed and by adding the dimension of temporal trajectories as the work activity progresses. Some approaches to adding structured work involve negotiation models (Action Technologies 1998) or graphical presentations of collaborative activities. The most successful CSCW technologies are ones where a balance is maintained between unobtrusive work models and lightweight, low-cost of adoption and usage versus structured, managed work and reconfigurable collaboration technologies. Current research is addressing dynamic evolution. Introspect (Tolone 1994) provides an architectural framework supporting reconfiguration, the Obligations prototype (Bogia 1995, Bogia and Kaplan 1995) provides a continuum of formal to informal work models and inter-person negotiations, and AnswerGarden (Ackerman and McDonald 1996, Ackerman 1994) exhibits self-organizing evolution including reflection (Dourish 1995) of the content and context determination. Each supports dynamism across a wide range of collaborative technologies including calendaring, scheduling, notification and communication technologies.

A major catalyst for the identification of software process as a research discipline was the 1987 proclamation that ‘software processes are software too’ (Osterweil 1987). This ‘shot heard ’round the software world’ launched dozens of research projects to explore the interactions between budding technologies for managing the complexity of software with budding technologies for managing the complexities of human communication and coordination in building these information artifacts. A similar impact was the ‘Spiral Model’ (Boehm 1988) which instituted a shift towards evolutionary software development to more accurately reflect the dynamic and incremental nature of software development processes as they are executed. The Arcadia (Kadia 1992, 1995, Taylor et al. 1988) project was a research group studying process-based environments and identified several key areas for supporting them. Among them are process programming (Sutton et al. 1995), object management (Tarr and Clarke 1993) and user interfaces (Taylor et al. 1995). Interaction between these initiatives became apparent in second generation projects. Teamware (Young 1994, Young and Taylor 1995) intermixed user interface and process programming research; Process Wall (Heimbigner 1992, 1996) intermixed object management and process. Later, these areas were expanded to include hypermedia (including the WWW) and software architectures leading to third generation environments off-shooting from the project (Bolcer and Taylor 1996, Sutton and Osterweil 1997). Concurrent to this project, but similar in scope, was the Eureka Software Factory (ESF) (Fernstrom 1991, Fernstrom and Ohlsson 1991). ESF was a large research effort to study technologies for providing support to large software factories including their lifecycle processes and reference architecture. The consortium developed various process formalisms combining them with other technologies: specialized Petri net models (Gruhn and Jegelka 1992, Peterson 1981); software data modeling (Hubert 1991), and document objects and rules (Emmerich 1991, Junkermann and Schaefer 1994). Similarly, the Atlantis (Kaiser 1997) project is exploring issues in process-centered environments with CSCW (Kaiser et al. 1998b, Kaplan et al. 1997).


Other independent initiatives introduced other key concepts. Early process modeling languages were borrowed from software development, such as dataflow definitions using control and dataflow techniques (Broy 1985, Fosdick 1976), IDEF or SADT, entity-relationships (Chen 1983, Gruhn 1994) and state charts (Harel 1987, Kellner 1991). Use of logic languages (Heimbigner 1989) was also explored. Flexible rule-chaining, and later flexible transaction support via WWW access and tool integration, was also pursued (Barghouti et al. 1990, Kaiser 1989, Kaiser et al. 1998a). The abundance of formalisms resulted in metamodels to assess the appropriateness of each (Clough 1993, Song and Osterweil 1994) and detailed looks into execution requirements of a process (Ambriola et al. 1990, Estublier 1996). Further research led to the proposal of a process execution reference framework similar to an operating system (Conradi et al. 1992), but further studies and experiments later highlighted the differences between human and non-human executed operations (Young and Taylor 1994). ProcessWeaver (Fernstrom 1993) allowed graphical depiction of processes with some relaxing of semantic formalisms and their representations, leading to further projects in visual modeling and specification understandable by non-technical as well as technical users (Barghouti and Arbaoui 1996, Barghouti et al. 1995, Young and Taylor 1995). Integration of tools and software using multiple languages was explored in Maybee et al. (1996). Slang and SPADE introduced the concept of reflection and virtual machine layers (Bandinelli et al. 1992a, 1993b). Later systems explored support for dynamic change and evolution in a process system (Bandinelli et al. 1993a, Fuggetta 1996, Fuggetta and Ghezzi 1994). Further explorations in dynamism and evolution included Heimann et al. (1996), Jaccheri and Conradi (1993) and Kobialka and Gnedina (1995). Endeavors (Bolcer and Taylor 1996) added in concerns of adoption, context adaptation, and transportable, lightweight mobile processes.

Many commonalities are starting to become apparent across disciplines (Dimitrios et al. 1995) such as explicit, commutative work models (Suchman 1983) and a need to move beyond individual automation to group coordination and collaboration (Kling 1991, Sachs 1995). One thing they have all shared is lack of widespread acceptance for all but the simplest of systems, even in their home field of software. Adoption issues have been studied in each respective area, and while strategies and guidelines for successful adoption are available in all three disciplines (workflow and office automation (Blair 1981, Ehrlich 1987), process (Christie 1994, 1995, Christie et al. 1997), CSCW (Markus and Connolly 1990)), no single system has been flexible enough to address the broad spectrum of concerns.

2.2. Technology Limitations and Confounds

Traditional online work solutions come in various flavors with overlapping limitations which prevent their effortless adoption into a work environment. In addition, the applicability of these technologies is being demoted as work environments are becoming interconnected, allowing lightweight dissemination of information and artifacts using various networking technologies such as the WWW and the Internet. One such domain is virtual enterprise engineering (VEE (Fielding et al. 1998)). Virtual enterprises need to evolve, even to the point of swapping out organizational divisions, groups and people, while at the same time guaranteeing precise, consistent interactions with high levels of assurance. Management and measurement infrastructure needs to be in place to guarantee that at any given point in time, the best interests of the virtual enterprise are served, usually with respect to trust, quality, time-to-market, profitability, efficiency and other variables. Various traditional technologies do not live up to this task because of the following limitations and confounds.

2.2.1. Process Technology

Process technologies assume closed world data consistency. This is usually enforced by limiting the actions of the user or agent to those only pre-specified by the process designer. In a dynamic, evolving real-world environment, sometimes it is necessary to jump out of the process or out of the system to adhere to the specified process model. Process models that are over-specified are difficult to follow because their requirements for completion are too rigid. Process technologies that support evolution as a mechanism to address this problem require that changes are monotonic, i.e. each successive change to the workflow model is additive, incremental and always strongly dependent upon previous models for data integrity and consistency. Synchronization between a process model, its data, and offline activities is difficult in situations where a participant’s activity diverges from the activity descriptions. Often this results from an exceptional case, escalated priority, or work redefinition. Consistency sometimes must be violated to meet both functional and non-functional requirements of the process being followed or the product being produced. Process models assume a uniform representation. Coordination between dispersed process participants is sometimes difficult because a uniform representation of activities, artifacts and resources may differ among people, groups and organizations. Adding the complication that various participants possess differing skills, different levels of understanding of their obligations, and may require domain and context specific views, misrepresentation and miscommunication are often the result.

2.2.2. Workflow Technology

Traditional workflow technologies have achieved relative success in the workplace through simplification. These systems have been applied to problems of limited scope and scale, usually automating a single point task as in an ad hoc workflow, or a small series of never changing steps as in a production workflow. Workflow specifications are ambiguous or contain inflexible semantics. Non-graphical workflow specifications often are hard-coded to include both the context and control policies. Understanding of the overall process is limited to those who can delve into the textual description. Graphical representations allow some abstraction but at the same time introduce ambiguity. Visual workflow programming languages often lack representations for timing and execution constraints as well as complex relationship specification and management between process objects and people. Lack of relationship management can precipitate poor exception handling.

Workflow processes that diverge from the intended series of planned activities or require some mid-stream changes result in exceptions. An exception can be something as simple as a missing resource or input; it can also be something more serious such as a design failure requiring a significant amount of rework or alternative procedures. Workflow technologies typically lack the infrastructure to query status and change values of their own processes, correct mid-stream deviations, or locate and reserve needed resources and artifacts. All exceptions, whether requiring minor automated intervention or human participation to correct, often require the reset and restart of a workflow process. This rigidity in handling exceptional cases shows up in a typical workflow product’s deployment model. Workflow processes, once they are deployed into their execution context, are rarely changed. Significant changes are difficult to accomplish in this environment as the workflow process model, the execution infrastructure and the process presentation are tightly coupled.

2.2.3. Groupware and CSCW Technology

Groupware and CSCW are broad labels that describe a gamut of synchronous and asynchronous technologies including email, shared whiteboards, meeting schedulers, collaborative desktops, video conferencing and other shared electronic media. While these technologies are useful for overcoming communication problems over time and collaboration problems over distance, they lack the guidance and automation mechanisms for performing structured tasks. While using these tools to accomplish unstructured tasks mimics some real world work environments, they do not guarantee consistency of results, maintain a standard of practice and procedure, or lend themselves to optimization, improvement and training. There is no explicit work model other than ad hoc utilization of the components at hand. There is no measurable status for determining completion, progress of a task, or even mechanisms for describing what work needs to be done other than capturing communication and collaboration relationships between participants. Adding a work model and management capabilities to a groupware or CSCW technology often leads to additional work on the part of the participant with no direct benefits such as guidance or automation. Synchronization between the proposed work and the actual work is often done by hand. This overhead of model maintenance may lead to the demotivation of the participants, whose motivation is crucial to the success of an activity.

2.2.4. Management Information Systems

Management information systems (MIS) are domain specific software solutions for managing the dissemination and control of information across large organizations. These systems are custom-built for specific information and automation needs, including domain and task specific structuring of data and documents, as can be seen in health maintenance or corporate tax preparation. A large portion of MIS is simply translating or reformatting the data and documents into a format applicable to each specific task or context. Control flow between people and groups is usually embedded in the implementation of the system, and handoff of data artifacts is only discharged along pre-ordained communication channels. The ability of the MIS to adapt to changing work procedures is difficult because components are not readily reusable and changes to a workflow task or data structure often have system-wide consequences, making the cost of change prohibitive. Changes in execution behaviour may require re-coding of data translators, user interfaces and execution scheduling and semantics. Because of this fixed-execution model, evolution of the MIS over time is done through whole-system upgrades. Divergence of a participant’s work from the specified task or structured data is usually not captured, and is swept under the rug until the benefits outweigh the cost of changing the software by the information technology (IT) staff and not the participants themselves.

2.2.5. Project Management Systems

Project management systems model and track the progress of user tasks against dates, times, priorities, costs and precedence dependencies. Tasks can be resource driven or effort driven, and a critical path to completion can be specified using different cost and time constraints. While some project management systems have multiple views into the work model, most are applicable to a single stakeholder, namely the project manager. This mismatch reduces the burden of status collection for the manager, but increases the effort needed to keep the status reports and state changes reliable for the project participants. End-users that derive no direct benefit from using the system have no incentive to maintain the data it uses. This additional work may lead to the divergence of the work model and the actual status. Another problem with project management systems is that the formalisms they use overconstrain the work specifications with respect to time and resource dependencies. Accuracy of this information changes the further away project projections are made from the present. As the project progresses, the plan usually remains fixed, and evolution causes inaccurate data and high maintenance, and the ‘best laid project plans’ are cast aside.

2.2.6. Common Usage

Common usage describes a large class of desktop software that is used in the day-to-day informal planning and execution of workflow processes. These include drawing programs, spreadsheets, batch files and shell scripts. Most non-technical users can visually describe processes using a desktop diagramming tool, analogous to a casual drawing on the back of a napkin. The diagram is then used as a means to communicate the intended process as an informal flow chart, series of sketches, or graphical checklist. Execution mechanisms to support such workflow specifications are non-existent. Execution takes place by hand, and whatever automation is incorporated into the workflow occurs when a technical user implements the description using a scripting or macro language, as sketched below. The problem with this is twofold. There is a complete separation between process model and process execution in that changes in one or the other cannot be reflected back in order to update the ongoing status. Secondly, workflow process descriptions lack formality and expressibility, which causes miscommunication of tasks, confusion, an inability to perform analysis, an inability to determine status, and lack of agreement on common representations, task definitions, shared artifacts or required resources.
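
To make the gap concrete, the following is a minimal, hypothetical Python sketch (the file names and directory layout are invented for illustration) of the kind of script a technical user might write to automate a single step of an informally drawn process. Nothing ties the script back to the diagram, so neither one reflects the other’s status.

    import os
    import shutil

    # Hypothetical 'common usage' automation: route scanned receipts from an
    # inbox folder to an approval folder. The drawn flow chart that motivated
    # this step is not represented anywhere in the code.
    INBOX = "incoming_receipts"
    APPROVED = "approved_receipts"

    os.makedirs(INBOX, exist_ok=True)
    os.makedirs(APPROVED, exist_ok=True)

    for name in os.listdir(INBOX):
        if name.endswith(".pdf"):
            shutil.move(os.path.join(INBOX, name), os.path.join(APPROVED, name))
            print(f"routed {name} for reimbursement approval")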

3. ADVANCED WORKFLOW REQUIREMENTS

Because workflow involves the interaction of humans and computer processes to generate and share information, its requirements need to reflect both the technical as well as social and organizational needs. While some process, workflow and groupware systems excel at particular subsets of these requirements, they have been limited in their level of flexibility, scalability and evolution over time. This section draws upon the current research across various disciplines and assembles the key requirements assuming an information-rich and highly-connected world. Support for multiple stakeholders, incremental adoption, object-oriented workflows, integration of external tools, customization and reuse, distribution and dynamic change are all explored in detail.

3.1. Support for Multiple Stakeholders

Support for multiple stakeholders is important in an advanced workflow system (Abowd and Schilit 1997, Kraemer and King 1988, Young and Taylor 1994). Stakeholders may have backgrounds in systems integration, business process modeling, process improvement and optimization, human factors, or even domain specific knowledge about the domain in which the workflow process is being applied. Stakeholders may be technical or non-technical. Interaction with the workflow system may be explicit, where the series of activities and dependencies are viewable by the end-user, or implicit, where the end-user is concerned with the local activities and may not be aware that they are part of a larger workflow process.

Each of these stakeholders may require a different presentation on the activities at hand. The interfaces presented to the end-user participating in a workflow process during execution constitute the work context. This may include interactions with tools independent of the workflow system. In the case where the end-user only interacts with the tools with which they are familiar to accomplish their work, the user is participating in an implicit workflow. An implicit workflow usually has no formal and manipulable view available to the end-user concerning how their work fits into the larger context. Information about where to find the artifacts needed to accomplish their work is usually informally specified or left up to the individual. Designing an implicit workflow requires the process designer to anticipate the tools and data required to accomplish the activity. This information, as well as the technical details of integration and tool invocation, are often specified in the workspace management component of the workflow system. The advantage of an implicit workflow is that in a smoothly running, repeatable workflow process, the end-user is unburdened with the communication overhead. Details of activities above and beyond the immediate task are abstracted away. To implement an implicit workflow, it is often necessary to present the information in the application domain. This requires customization of the views into the system either from the process designer’s or end-user’s perspectives.

While implicit workflows are useful for hiding details of a workflow process not appropriate to the end-user, making them explicit allows the participant to see how their activities fit into the greater context of the project. Greater coordination between process participants can be achieved by allowing end-users to view the data artifact dependencies and locations, tool availability for manipulating them, and explicit definition of which other participants are relying on their work. The expectation of data and document handoff or resource sharing has the self-enforcing effect of encouraging participants to meet their social contract obligations in a timely manner. Adding functionality to the end-user system to both understand and change the workflow processes in which they are participating makes the system flexible by allowing local optimizations with minimal or no impact on global process constraints. However, when imposed across different parts of an organization, these constraints may often in fact be conflicting. It is important that explicit workflow systems provide mechanisms that allow local optimization, such as multiple solution paths, guidance and multiple coordinated views, while at the same time not breaking or disrupting the flow of the global process by providing mechanisms for help and answer, data agents for resolution, enforceable rules and constraints, and a strong separation of concerns between the data artifacts and the process by which they are created.

3.1.1. Accessibility

System integrators, workflow process programmers, managers and end-users all may need access to tools, interfaces, documents, status information, to-do and activity lists, and other resources (Nuseibeh 1997). The WWW has made it possible for non-technical end-users to quickly share information by publishing it on the Web. Intranets have made it possible to control that information in order to keep it proprietary and ‘in the family’. While the availability and dissemination of information aids in the completion of tasks, that alone does not provide the framework within which it should be used. Similar to how non-technical end-users can both browse and author Web pages to disseminate information and reflect task progress, workflow processes should support the same capabilities. In addition to this, a wide audience of process stakeholders should be able to search, browse, initiate, view, edit, annotate, replace, and, at the very least, understand the workflow processes as appropriate.

3.1.2. Data Agents

Data agents are small, autonomous software modules that automatically perform a localized task. Intelligent agents, in addition, have the ability to reason about the input data and manipulate it in such a way that it can work in conjunction with other agents without violating any global constraints (Hanks et al. 1993). Examples of activities that data agents perform range from simple to complex. An agent’s activities may include automatic notification via email of the availability of a report, sending a reminder of a scheduled meeting, advising a user of alternative meeting times as the result of a scheduling conflict, or even actively going out and doing research based on some specified criteria. An agent’s activities, from the viewpoint of the workflow system, should allow the activity to be accomplished by either a human or non-human agent or both. By allowing the mix of both human and non-human activities, the availability of more intelligent agents allows for the incremental automation of certain activities. The people participating in the workflow should be allowed to hand off or delegate activities to an agent or take ownership of the activity back from one. Similarly, agents should be allowed to delegate tasks to sub-agents or request the help of human participants as needed. The management of multiple agents allows end-users to accomplish tedious or repetitive tasks with minimal effort and avoids the ‘keep checking back’ syndrome. Agents should allow customization by individuals with or without technical programming knowledge, notification based on existence and changes, constraints on values and appropriate strategies for resolution, and significant event filtering and topic interest registration. At some level of abstraction, an agent and an automated workflow process are interchangeable. Agents historically have been focused on non-human involved, small-scale coordination activities.
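
As an illustration of the human/agent handoff described above, the following Python sketch is hypothetical (the class and function names are invented, not drawn from any system discussed in this paper): an activity has an owner that may be a person or an agent, either side can delegate the work to the other, and a simple notification agent fires when the artifact it watches becomes available.

    from dataclasses import dataclass

    @dataclass
    class Activity:
        """A single workflow activity that may be owned by a person or an agent."""
        name: str
        owner: str = "unassigned"   # a person's name or an agent identifier
        done: bool = False

    class NotificationAgent:
        """Hypothetical data agent: watches for one artifact and notifies subscribers."""
        def __init__(self, artifact, subscribers):
            self.artifact = artifact
            self.subscribers = subscribers

        def run(self, available_artifacts):
            # Fire only once the artifact the agent is waiting for appears.
            if self.artifact in available_artifacts:
                for person in self.subscribers:
                    print(f"notify {person}: '{self.artifact}' is now available")

    def delegate(activity, new_owner):
        """Hand an activity off to an agent, or take ownership back from one."""
        print(f"{activity.name}: {activity.owner} -> {new_owner}")
        activity.owner = new_owner

    # A person delegates the announcement step to an agent; the agent fires once
    # the report exists. Ownership could be taken back just as easily.
    announce = Activity("announce quarterly report", owner="alice")
    agent = NotificationAgent("quarterly-report.pdf", ["bob", "carol"])
    delegate(announce, "notification-agent-1")
    agent.run({"quarterly-report.pdf"})
    announce.done = True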

3.1.3. Guidance

Guidance allows the workflow system to walk a number of process participants through a series of steps to accomplish a goal. Guidance can be used to teach or train, as in a computer based learning system that teaches mathematics or job specifics, or it can be used with trained and highly skilled participants to ensure quality of service, quality of outcome, or tracking adherence to standards (Dey et al. 1998, Fischer et al. 1992). An example of this is the health care industry, where the workflows require an efficient flow of precise information and steps available to doctors, nurses and administrators in order to ensure a certain standard of care is met.

Guidance should be flexible and configurable for the work context. Subsequent choices and their consequences should be made clear. Decisions on choosing the next activity are either made by the user after being advised by the system of their alternatives or by the workflow system based on data input and values. End-users should be presented with information detailing how far into the workflow they have gone and how far they have left. As an example, this can be done with the number of questions in a training exercise, ‘you have finished 30 out of 40 questions’, or with time estimates, ‘sending off the results to the lab will take approximately 3 hours’. Guidance is related to enforcement. Systems that strictly limit the choices by the end-user enforce compliance with the process.
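
A minimal sketch of the progress reporting described above, with hypothetical step names and effort figures invented purely for illustration:

    from dataclasses import dataclass

    @dataclass
    class Step:
        name: str
        minutes: float           # rough effort estimate for the step
        completed: bool = False

    def progress_report(steps):
        """Tell the participant how far they have gone and roughly how much is left."""
        done = sum(1 for s in steps if s.completed)
        remaining = sum(s.minutes for s in steps if not s.completed)
        return f"finished {done} of {len(steps)} steps; about {remaining:.0f} minutes remain"

    steps = [Step("fill in patient history", 10, completed=True),
             Step("order lab work", 5, completed=True),
             Step("review lab results", 180),
             Step("write up treatment plan", 20)]
    print(progress_report(steps))   # finished 2 of 4 steps; about 200 minutes remain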

3.1.4. Separation of Concerns

Separation of concerns allows greater flexibility and dynamism in the workflow system. Decoupling components of the system allows them to be swapped out, replaced, or even added to the system without impacting other parts (Barghouti and Arbaoui 1996, Barghouti et al. 1995, Griff and Nutt 1997). Strong separations between the workflow model and its presentation, the work context and the process, the process implementation and its design are all important.

In a system where the visualization and the process are tightly coupled, it is difficult to provide alternative views of the workflow. Design of a workflow process where the visual representation is the language with no underlying programmatically manipulable model severely limits the interactions and automations that can be performed with the system, especially while the workflow process is executing. While graphical representations are useful for abstracting detail to allow model changes by non-technical users, this abstraction often causes ambiguity. This ambiguity is often resolved semantically at the model level, which requires model manipulation independent of its presentation.

Workflow process models and the context in which they are executed should also be decoupled. The access and availability of tools, documents and artifacts to the end-user should be configurable by both the workspace designer and the end-user themselves. By separating the semantics of the workflow from how the workflow is executed, processes can be easily reused in new and different domains or downloaded across a network and executed in the end-user’s work context without violating the constraints of their work culture. The separation of the workflow model from its execution context allows late-binding and dynamic loading of tools and resources as needed or guidance and enforcement policies as required. In a sense, the separation of the work context from the process model is analogous to cascading style sheets (CSS) on the WWW, which separate Web content from the style guidelines. Similar to how the HTML renderer makes context dependent display decisions about its presentation, the workflow execution engine can use the workspace context to scope the execution.
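
A toy sketch of the style-sheet analogy above, with the tool names and the two work contexts invented for illustration: the same workflow model is executed under two different contexts, and the tools are late-bound at execution time.

    # The workflow model names abstract capabilities, not concrete tools.
    workflow_model = [
        {"activity": "draft report", "needs": "editor"},
        {"activity": "circulate for review", "needs": "mailer"},
    ]

    # Execution contexts bind abstract capability names to whatever is locally
    # available, much as a style sheet binds presentation to content at display time.
    engineering_context = {"editor": "emacs", "mailer": "sendmail"}
    marketing_context = {"editor": "word", "mailer": "outlook"}

    def execute(model, context):
        for step in model:
            tool = context[step["needs"]]   # late binding of the tool
            print(f"{step['activity']}: launch {tool}")

    execute(workflow_model, engineering_context)
    execute(workflow_model, marketing_context)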

Separating implementation from design allows changes to the process implementation such as language or tool integrations without changing the specification. Model changes can also be made by reusing implementations or supplying alternative ones. This separation allows late-binding of resources and flexible execution based on availability of these resources at the time they are needed. Also, because workflow process context differs across development, deployment and execution, a separation of concerns minimizes changes needed to accommodate these stakeholders. In the workflow process world, testing a process is difficult without actually deploying or executing it. By minimizing the changes made to the process by being able to switch implementations and context, it is possible to use the same process for testing, simulation, feedback and experimental design as well as actual deployment and in-place execution of the process. This allows smoother transition, adoption and reuse models.

3.1.5. Solution Paths

The final outcomes and goals of a workflow process should allow for multiple solution paths to that goal. Completion of a process should not only be proactive, as in a production workflow system with work being pushed along by the workflow engine, but reactive as well, where new messages, events and actions are triggered by some state change in the model (Bogia 1995, Bogia and Kaplan 1995). Add to the equation exceptions, the unpredictability of humans, and the common occurrence of work sometimes just going wrong, such as miscommunications or missing or incomplete information, and it becomes important to allow for multiple solution paths. Complex activities should be configurable to allow a range of options. The activity may be left unspecified, relying on the ingenuity of the end-user to achieve the goal at hand. Otherwise several methods and processes for achieving the goal can be selected by the participant or automatically selected based on the availability of resources at the time. Process participants, whether human or automated agents, should be allowed to stray from the intended path if appropriate. This sometimes will violate global constraints such as communication and coordination with other process stakeholders. While it is important to provide alternative solution paths to encourage process improvement and optimization, the workflow system must be flexible enough to balance this with enforcing process obligations across organizations and levels of abstraction.
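
The following hypothetical Python sketch illustrates the idea of multiple solution paths for one goal: each path carries a guard over the currently available resources, the first satisfied guard is chosen, and a completely unspecified fallback is left to the participant. All names are invented for illustration.

    def review_by_meeting(resources):
        # Preferred path: a face-to-face design review.
        return "schedule a design review meeting"

    def review_by_email(resources):
        # Alternative path: circulate the design asynchronously.
        return "circulate the design by email for comments"

    # Each path pairs a guard over the available resources with a method.
    solution_paths = [
        (lambda r: "meeting room" in r and "all reviewers" in r, review_by_meeting),
        (lambda r: True, review_by_email),
    ]

    def achieve(goal, resources):
        for available, method in solution_paths:
            if available(resources):
                return f"{goal}: {method(resources)}"
        return f"{goal}: left unspecified; rely on the participant's ingenuity"

    print(achieve("design review", {"all reviewers"}))   # falls back to the email path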

3.1.6. Cross-Organizational

Organizations are large groups of people working together to deliver products or services. These groups usually focus on the sub-tasks of creating a larger product or service and unfortunately sometimes work at cross-purposes (Fitzpatrick et al. 1995). A quality assurance team may focus on ensuring a product is sufficiently tested while a sales team may want to release a product or service to stay ahead of the competition. Organizations are social constructs whose groups share terminology and closely related sets of tasks. Skills for an individual to succeed in one part of an organization do not necessarily transfer easily into another. Likewise, tools, techniques, management styles, work ethics and environments vary across an organization. This makes it difficult for various stakeholders to participate in the same process. Two individuals from different parts of an organization often do not even use the same terminology when speaking of the same project, much less share the same technical infrastructure.

A cross-organization workflow needs to be able to leverage and adapt to the existing infrastructure. Tools and software should be substituted as needed or else bundled in with the workflow process. Views and presentations should fit the user model and abstract away unnecessary detail or inversely annotate it with group specific information. Mismatches of data or expectations, which most often occur at organizational boundaries, should be identifiable and allow for automated or human involved negotiation and resolution. Global rules and constraints should be enforced at the organization level, but provide enough flexibility to allow groups to ‘grease the cogs’ of the process to reduce bottlenecks or handle exceptional cases. Finally, because of the diverse social, political and technical infrastructures, the implementation of the process itself should be cross-platform, lightweight and mobile. This allows fragments of the process to be distributed and executed when and where they are appropriate or needed.

3.1.7. Rules and Constraints

Rules and constraints are the mechanisms used to maintain process and data consistency. Rules are formal or informal, local or global. Informal and global rules are often described as business directives such as ‘all timesheets must be turned in on time or else the disbursement of funds will be delayed until the next pay cycle’. Other rules and constraints may alert a project manager when an activity misses a deadline or falls off the critical path. While natural language style rules are easy to specify, they are often ambiguous. The actual implementation of workflow rules should be separate from the specification and allow for as formal a description as is appropriate from the author.

Rules and constraints are separated into conditions and actions. Conditions should allow timing and synchronization, flexible pattern or value matching, and configurable generality and appropriateness (Kaiser 1989, 1996). Actions effect some notification or state change in the workflow system that may be visible or invisible to the process participants. The implementation of the action, similar to the implementation of the workflow process, should allow for flexible tool integration and support for multiple implementations which may be dynamically chosen. Rules can be mandatory or optional, and in the second case, end-users should be allowed to turn on and off the firing of the rule. One example of this is email registration. An end-user can register interest in the posting of a report or document. The workflow system can then notify the appropriate parties of its availability.

Some workflow processes lend themselves well to being specified in a rule-based manner. The flow of the process is governed by the series of rules that it matches. These types of workflow engines are called reactive systems. The execution and firing of rules should be flexible also. Multiple or even conflicting rule matches should allow for prioritization. Missing or partial information should allow for both forward and backward chaining, and the number of times a rule is allowed to fire should be configurable. Also, it is important to allow human interaction in both recognizing conditions and executing actions.
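
As a concrete, hypothetical sketch of the condition/action rules described in this subsection (the rule names, priorities and the state dictionary are all invented for illustration), the following shows a mandatory and an optional rule, priority ordering among matches, and an end-user switch for optional rules:

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Rule:
        """A condition/action rule with a priority and an on/off switch."""
        name: str
        condition: Callable[[dict], bool]
        action: Callable[[dict], None]
        priority: int = 0
        enabled: bool = True     # optional rules can be switched off by the end-user

    def fire(rules, state):
        """Evaluate enabled rules in priority order and run the actions that match."""
        for rule in sorted(rules, key=lambda r: -r.priority):
            if rule.enabled and rule.condition(state):
                rule.action(state)

    rules = [
        Rule("timesheet deadline",                      # mandatory business directive
             condition=lambda s: s["timesheet_late"],
             action=lambda s: print("delay disbursement to the next pay cycle"),
             priority=10),
        Rule("report interest",                         # optional notification rule
             condition=lambda s: "status-report" in s["new_documents"],
             action=lambda s: print("email subscribers: status-report posted"),
             priority=1),
    ]

    fire(rules, {"timesheet_late": True, "new_documents": {"status-report"}})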

3.1.8. End-User Configurable Views

Workflow processes involve the coordination of activities, control of artifacts and communication between stakeholders. Each of these may require presentation to an end-user in different ways or highlighting different aspects or importances. Some users may require domain specific information or layout while others may become confused or misled by extraneous or unneeded information. End-user configurable views are needed in addition to stakeholder views in that they allow the end-user to form a customizable context to which they can return (Ackerman and McDonald 1996). Process participants may require functionality similar to how a group leaves their scattered papers in a meeting room when they break for lunch in order to preserve the context and make it easier to pick up the meeting when they return. End-user configurable views, by allowing the participants to customize the workspace as appropriate and to their liking, increase the chances that the work will be done within the workflow context where it is easily captured and measured in order to share status with others dependent on the work. At the same time, they allow the end-user easy access to local state and information. This gives participants the ability to revise a document or test code until the point of handoff, as well as the ability to coordinate between multiple projects and processes.

3.1.9. Help and Answer

Help and answer is important not only when using a workflow system, but also when using and selecting processes. Help and answer should be context aware and configurable in the level of technical detail based on the role of the participant requesting help. Related concepts should make use of hyperlinks, and in the case of a question, the system should allow that the answer be a workflow process also, if appropriate (Ackerman 1994). Help and answer implemented as a workflow helps determine the source of the problem, identify the appropriate resources to use or people to contact, and provide interactive guidance to converge on a relevant answer. In addition to end-user help and answer, workflow developers should have access to process wizards and templates.

Help and answer should include evolutionary capabilities as well. This may involve annotations on a process’s usefulness, relevancy to certain domains, overviews, feedback, or even customization and reuse instructions. Information should be synchronized and co-evolve with the process. Techniques for accomplishing this include automatic generation of system APIs, tight integration with hyperlinks and hypermedia, and flexible searching and browsing including standard reference locations and libraries. Decoupling of the help and answer from the actual process allows for the latest information on the process or product to be updated over the WWW or other network protocols. Process and task specific tips and troubleshooting should also be available.

3.2. Incremental Adoption

Incremental adoption is key to the successful implementation, deployment and execution of an evolving workflow system and its processes. Previous workflow and software process systems have taken a top-down, all-or-nothing approach to adoption. While this may work in some organizations, it inhibits these systems from being more widely accepted. Likewise, CSCW systems often require a plurality of users before the group benefits can be realized. Incremental adoption allows the benefits gained from using the system to match the amount of effort and resources available by controlling the scope and pace of adoption (Blair 1981, Christie 1994, 1995, Christie et al. 1997, Ehrlich 1987, Grudin 1988, Grudin and Palen 1995, Markus and Connolly 1990).

Workflow systems often require a certain amount of computer expertise, requiring training of personnel to both develop processes and participate in them. Often this requires changing the work culture, and the technology is seen as getting in the way of accomplishing their work, significantly impacting the adoption and use of the technology. Participants who lack the ability to change or optimize the process, either because of lack of skills and training or inflexibility of the system, tend to follow a parallel process duplicating work. Allowing end-users to evolve some or all of the workflow processes in which they participate amortizes the cost of adoption across all of the stakeholders but at the same time distributes the rewards.

Process stakeholders fall into the trap of duplicating work, both online and offline, because the tools they need to accomplish the task at hand are difficult to integrate with the workflow system. Also, process objects are difficult to reuse because they are usually over-specified and only relevant to a particular process, project, or individual work context. Few departments, groups, or individuals are willing to pay the overhead costs that benefit others down the line, and they are unable to incorporate global charges of time, money, or effort into current projects due to immediate resource constraints. Workflow objects that represent activities, artifacts, or resources tend toward special case behaviors because of the context customization necessary for execution. The maintenance, configuration and deployment of all these objects becomes difficult to manage and track. The cost of deploying these workflows across diverse machines, protocols and languages, and of installing minimum and consistent process tools and technology on every relevant desktop, is very high. There are very few tools to manage all the necessary componentry in a distributed fashion, and most workflow systems are closed or difficult to customize because they have no open APIs.

3.2.1. Scalability
Workflow systems not only need to support individuals, groups, large organizations, or an entire enterprise, they must also gracefully scale with the number of participants and the size, complexity, scope and purpose of the project over time. Performance is always an issue, but more importantly, end-users require access to information that is timely, appropriate and easy to find. Abstraction is the key concept. End-users need to be isolated as much as possible from problems arising with the addition of significantly more users. Workflow systems can address this by hierarchically abstracting the workflow and its representations. In order to maintain consistent and coherent project data across large or increasing numbers of people, details that may embody large, multi-person workflows themselves can be encapsulated. Workflow systems should support scaling to meet the needs of the process developers and maintainers in addition to the end-users. This can be done with process support tools similar to how complexity in software is handled. Requirements and rationale tracking, debugging and analysis tools, optimization, reuse, standardization, version control and banding groups of functionality at the same level of abstraction into a virtual machine are all techniques applicable to scaling workflow processes as well as software.

3.2.2. Entry Barrier
The entry barrier is measured by the amount of customization, administration and installation costs required before some benefits can be derived from putting the workflow system into use. This involves domain customization where the objects, components, and views of the system are tailored to the specific target domain, such as health care, electronic commerce, finance, manufacturing, software development, or others. Each domain should share the core infrastructure of the workflow system but include one or more application layers built on top. By evolving the workflow infrastructure to accommodate shared capabilities across domains or even concerns within a domain, the amount of customization required for changes and adoption becomes less. New uses of the technology can be targeted by adding a thin application layer and reusing activity, artifact and resource definitions as well as workflow processes if applicable. Reuse can be facilitated by providing extendable and customizable user interface and workflow components in an online marketplace setting.

Administration and installation costs also raise the entry barrier. Deployment costs of installing minimum and consistent process tools and technologies on every relevant desktop are problematic. Add to this the fact that stakeholders may require diverse machines, protocols and implementation languages, all of which may lack cross-platform coordination and standard use capabilities, and the cost of deployment begins to outweigh the perceived benefits. In addition to the workflow execution infrastructure, the workflows themselves need to minimize the entry barrier to their use. Both of these can be addressed by a lowest common denominator approach. The WWW follows this approach because of its ubiquity, and end-users leverage their training and experience with hypermedia and hyperlinks. This also makes it possible for end-users to participate in a process with no explicit installation. Adding a mobile applet or scripting execution language (similar to Java, Python, Perl, or Tcl) and supporting text-based representation with an explicit, well-defined workflow language allows workflows to be deployed, invoked and monitored using standard network protocols such as HTTP (HyperText Transfer Protocol) (Berners-Lee et al. 1996, Fielding et al. 1996a) or SMTP (Simple Mail Transfer Protocol).
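As a minimal sketch of this lowest common denominator style of deployment, the following Python fragment deploys, invokes and monitors a workflow purely over HTTP. The server URL, endpoint paths and JSON fields are hypothetical stand-ins, not the interface of any particular workflow engine.

```python
# Sketch: deploy, invoke and monitor a workflow over plain HTTP.
# BASE, the endpoint paths and the JSON fields are assumptions.
import json
import urllib.request

BASE = "http://workflow.example.com"  # hypothetical workflow server

def deploy(process_definition: dict) -> str:
    """POST a text-based workflow description; return its assigned id."""
    body = json.dumps(process_definition).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE}/processes", data=body,
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["process_id"]

def invoke(process_id: str) -> None:
    """Start an instance of a deployed process."""
    req = urllib.request.Request(
        f"{BASE}/processes/{process_id}/instances", data=b"{}", method="POST")
    urllib.request.urlopen(req)

def status(process_id: str) -> dict:
    """Monitor progress with an ordinary GET, readable from any browser."""
    with urllib.request.urlopen(f"{BASE}/processes/{process_id}") as resp:
        return json.loads(resp.read())
```

Because the interaction reduces to standard requests and a text-based representation, a participant with nothing more than a browser or mail client can still take part in the process.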

3.2.2.1. Inform/Automate
Workflow systems provide information to the participants needed to accomplish a task but may also automate these tasks based on available information. Workflow processes intended for execution are most easily deployed initially as inform only. This allows coordination and communication between end-users but leaves the execution of the activities to the discretion of the individuals. Automation capabilities can then be incrementally added by addressing the highest payoff areas first. Inform tasks may include flexible access, searching, sorting, prioritizing, gathering, editing and so on. Automation tasks may include collation, routing, synthesis, format conversion, notification and update among other things. Intermixing the information and automation aspects of a workflow system is crucial to its success. The automation tools must be able to recognize the results of the participant's work such as activity and completion information. Likewise, participants need to be able to easily utilize the automation capabilities of the workflow system.


3.2.3. Work/Reward Matching
Workflow may not benefit those individuals who directly use it. It is the group they are part of that benefits through improved coordination and communication. The degree to which the benefit end-users gain from using the system is balanced against the amount of work required to maintain consistent data and state for a process is called work/reward matching. Care should be taken to ensure that mismatches are not encouraged in the system. Individuals who have the ability to formulate processes to automate their own tasks match their work with the benefits gained from it. Groups require matching also. Because group benefits are shared among the members, it is important not to 'saddle' individuals with the majority of the work. Similarly, shared work that benefits only a single or small number of individuals within a group may inhibit adoption, as end-users may be unwilling to put the time or effort into using the tool if the benefits are perceived to be only accomplishing the work and goals of the few. The best approach to work/reward matching is to distribute the work and the benefits evenly among the members of a workgroup. This means that the workflow system may require development of processes by non-technical as well as technical end-users. Further, the cost to deploy and execute a process should be commensurate with the eventual benefits from that process. A simple, commonly used workflow should require minimal skills and effort to customize, deploy and execute.

3.2.4. Maintenance
Processes, as well as the means to execute them, change and evolve over time. The ability to incrementally change a process regardless of whether it is executing or deployed is a maintenance task. Maintenance involves versioning and configuration management for both the workflow process and the state it keeps. The difficulty encountered in evolving, distributing and deploying the process and state over time while making these changes should be minimized. Tools to aid in configuration and customization of the workflow system are the easiest way to reduce maintenance tasks. Also, workflow processes should be built with the assumption that they will change over time. By building a maintenance process into the workflow system, the effort to checkpoint, upgrade, evolve and even recall a process is reduced.

3.2.5. Adoptability
Adoptability targets how well a workflow system integrates with the existing work culture and technical infrastructure. Work cultures that have had very little exposure to automated technologies may lack the sophistication to immediately adopt a workflow system. Training is an inhibitor, but so is trust. Workers first of all need to recognize the utility of the system and then trust it enough to delegate or take ownership of tasks usually performed by hand. As the reliability and usage of a workflow system increases in an organization, the sphere of its applicability expands. Workflow systems should support a continuum of tasks from the simple to the complex in addition to seamlessly integrating activities performed by both humans and computers.

Technically, the workflow system is more easily adopted by using the tools and infrastructure that are already in place. For example, systems that require real-time distribution of data and immediate feedback of results may be difficult to adopt in an organization where significant numbers of systems are intermittently used, disconnected from the main network services, or require special tools or software. Eventually, as the technology becomes integral to the organization, the technology infrastructure can be evolved to include new requirements, but again, the lowest common denominator approach to infrastructure ensures adoptability.

3.2.6. Granularity
Granularity of both the workflow system and the workflow process affects adoption. Large-grained system components may encompass related groups of shared functionality such as persistence or distribution into a server, thus removing the need for duplicate infrastructure in the clients. However, packaging functions in this way may reduce the applicability of the system by adding in unneeded services, fixing the user model, or advocating execution, security, or distribution policies that may be difficult to reconcile with the work culture. Fine-grained and lightweight, interoperable components increase the flexibility of the system with respect to incremental adoption. Small, targeted sets of components allow evolution of the system to address new priorities and goals of the organization. Fine-grained components offer small, efficient modules of functionality, and are more easily exchanged or upgraded with less impact on other parts of the system. Maintenance of the workflow system is also easier for both the end-users and workflow developers through a series of small, incremental changes as it reduces the complexity. Often evolution of a system by introducing new components will result in inconsistency of either the data or tools, requiring some amount of rework. By using fine-grained components, it becomes easier to isolate and address the impact of these changes. Workflow primitives should be fine-grained also. While these primitives can be encapsulated or abstracted into larger grained processes, the activities, artifacts and resources should be interchangeable, allowing for process reuse.

3.3. Object-Oriented Workflows

Workflow objects should be defined in an object-oriented manner. The basic unit of the workflow should be object definitions ideally abstracted from real-world structures. This provides a clean, conceptual one-to-one mapping between the real-world objects and the model objects maintained within the system. Object-orientation, by using information hiding and abstraction, makes the workflow easier to understand by end-users and to maintain by developers. This approach allows for a consistent work model across human executed and computer executed activities even as the definition and understanding of these tasks evolve over time.

Object-oriented workflows, in addition to encouraging consistency, facilitate the clean definition of interfaces, enhancing the ability of the activities, artifacts and resources to be reused. Designing and implementing a workflow in this manner focuses on the definition of the data and the interfaces needed to manipulate it. This data is combined with the behavior specifying how to create or transform particular structures and values of the data to form a workflow object. Independent manipulations of both the data and behaviors facilitate reusing object definitions for various ends using object-oriented mechanisms such as inheritance, polymorphism, concurrency and encapsulation. Object-oriented workflow design makes it easier to integrate tools to manipulate the objects while hiding their implementation details. This allows the workflow to be built in incremental stages, focusing first on the critical parts of the workflow's activities, and encourages alternative or evolving implementations. Clean definition of interfaces also allows a control point for access. Different roles in an organization may restrict access to sets of artifacts such as documents or data. In addition, the activities to create, define and revise the workflow artifacts can be limited to those with the appropriate permissions or security.
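The following sketch illustrates one way such an object model might look in code. The class and field names are purely illustrative assumptions, not drawn from any particular workflow system.

```python
# Illustrative sketch of object-oriented workflow definitions mapping
# real-world activities, artifacts and resources to model objects.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Artifact:
    """A work product whose internal representation stays hidden."""
    name: str
    version: int = 0

    def revise(self) -> None:
        self.version += 1          # behavior travels with the data

@dataclass
class Resource:
    """A person, role, or tool assignable to activities."""
    name: str

@dataclass
class Activity:
    """Base unit of work; subclasses specialize behavior via inheritance."""
    name: str
    inputs: List[Artifact] = field(default_factory=list)
    outputs: List[Artifact] = field(default_factory=list)
    assigned_to: List[Resource] = field(default_factory=list)

    def perform(self) -> None:
        raise NotImplementedError   # may be human- or computer-executed

class ReviewDocument(Activity):
    def perform(self) -> None:
        for artifact in self.inputs:
            artifact.revise()       # reuse the artifact's own interface
```

Because tools manipulate only the interfaces (perform, revise), alternative implementations of an activity can be swapped in incrementally without disturbing the rest of the workflow definition.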

3.3.1. Process and Development Tools
Workflow systems should have available to various stakeholders a multitude of tools to help manage the design, development, deployment and execution of a workflow process. These tools should allow the stakeholder to manipulate the conceptual model of the workflow at all stages. Workflow development tools should mimic the capabilities of software development with the exception that the target language is a process modeling language. Implementation of the workflow process also should easily leverage software development and integration technologies. Testing and simulation tools should be available to the process architect for pre-deployment execution or simulation to verify the process with respect to its anticipated target environment and end-users' requirements. Execution tools and technologies should aid in managing changes while the process is running or provide feedback to help guide its execution and evolution. Finally, the workflow system should fit seamlessly into the end-user's environment, being invisible when required. Support for the tools should be flexible and configurable by the end-user, but also provide basic services for the most common tasks across all stakeholders' day-to-day work.

Various technologies recently adopted in the software field are easily re-targeted to the workflow domain. Visual programming is a promising technology in the development and design of workflow processes because it allows non-programmers to manipulate a program model at a high conceptual level. Distributed execution and resource levelling of program execution also map well to the workflow domain because of the inherent attributes of individuals to work independently and concurrently. Scripting and interpreters allow incremental execution and interactive changes. Parsing and generation tools are also helpful for managing a process's implementation, allowing round trip changes from implementation code to design or vice versa. Both workflows and their implementations may easily benefit from component reuse libraries, process templates and rapid prototyping and design tools and environments.

Workflow processes cannot always be tested before being deployed or executed because of the complexity of the process, the necessity of human interaction, the participation of human executed activities in differing roles, and the prohibitive cost and effort to build and maintain separate execution contexts for testing and simulation. To aid the process developer in delivering a more effective process, tools for process testing and analysis such as syntax checking, workflow path coverage, exception handling and even correctness of the workflow are necessary. These tools ensure that the design of the workflow is sound, and they should complement the traditional software testing and analysis of the workflow's implementation. Bottlenecks in the workflow should be flagged, and warnings issued for incomplete or missing items. Control, data and resource flow values should be verified for existence, definition and use. Also, standard project management analysis tools should be used to check the feasibility of the process with respect to time and budget constraints.

Once a workflow is deployed, execution tools should be available to the process and system programmers to make changes to the running process. Execution tools may include remote or peer interpreters in order to inspect or change the state of objects, agents for capturing evolving state and process interactions, version control of processes and workspaces, printing, searching, browsing, and on-the-fly tool management and integration. These services should be used together for evaluation of the process to measure the performance and provide for process optimization. Also important is the ability to debug a process before, during and after execution. When a workflow exception occurs, the system and process programmers need to have access to the execution context at the time it was triggered. Rollback of process state and playback of sequences of events aid in diagnosing what went wrong in the process.

In addition, end-users should have their own set of tools and services. End-user workspace customization allows conceptual rearranging of tasks to fit the work context. Also, traditional tools shared across workflow domains such as email, planning, scheduling, calendaring, event notification and interest registration should all be made available to the end-user and presented in a way that is easy to apply to all aspects of their work.

3.3.2. Semantics
Workflow descriptions should be based on a semantically rich modeling language. This language should be precise, consistent, expressive and extensible (Osterweil and Sutton 1996). The language should be expressive enough to represent steps, loops, branches, conditionals, routing, assignment, timing, scheduling and constraints. The semantics of the system should also allow definition of control, data and resource flow as well as definition of policies on how to handle interruptions to them. Overall, the modeling language should be able to completely and accurately describe non-trivial workflow processes and support refining those definitions to be as precise as the work requires over time. While some workflow systems allow processes to be designed using a visual modeling language, it is important that there is a strong semantic description underneath the graphical representation. This is necessary because while graphical views of the process are easier to understand, they do not always convey the precision and details of a complex process, which may cause ambiguity. Confusion between process stakeholders resulting from differing interpretations of the graphical representation creates discrepancy between what is intended and what is actually done. The underlying semantic model should be sufficiently rich in its workflow description to allow queries, whether programmatic or posed by a person, to resolve these types of interpretation problems.
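To make the distinction between the graphical view and the underlying model concrete, the sketch below encodes a small process with steps, a conditional branch, a loop and constraints as plain data that can be queried directly. The field names and the example process are hypothetical; this is not the syntax of any particular workflow modeling language.

```python
# Purely illustrative semantic model of a process: steps, a branch,
# a loop and constraints, expressed as queryable data rather than a diagram.
review_process = {
    "name": "change-request",
    "steps": [
        {"id": "submit",    "role": "author",    "produces": ["request"]},
        {"id": "triage",    "role": "manager",   "consumes": ["request"]},
        {"id": "branch",    "condition": "triage.accepted",
         "if_true": "implement", "if_false": "reject"},
        {"id": "implement", "role": "developer",
         "loop_until": "tests.pass"},           # iterate until the constraint holds
        {"id": "reject",    "role": "manager",  "notifies": ["author"]},
    ],
    "resources": {"author": 1, "manager": 1, "developer": 2},
    "constraints": ["implement.deadline <= 14d"],
}

def steps_for_role(role: str) -> list:
    """Query the semantic model directly, independent of any graphical view."""
    return [s["id"] for s in review_process["steps"] if s.get("role") == role]

print(steps_for_role("manager"))   # ['triage', 'reject']
```

A query such as steps_for_role resolves questions that a drawing alone may leave ambiguous, which is the point of keeping the rich description underneath the picture.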

3.3.3. Meta-Process
Meta-processes are the means and mechanisms which govern how a workflow process can change. This change can result from several directions including insertion of new technology into an organization, feedback from process participants, or competitive market pressures, to name a few. Meta-processes are processes themselves and may describe the steps and constraints of how to develop, evolve, upgrade, install, uninstall, distribute, deploy, integrate tools into, or customize a process (Balzer 1994, Bandinelli et al. 1993a, Estublier 1996, Tolone 1994). The end product of a meta-process is an executable workflow process model for some specific domain or work context. Meta-processes are subject to the same requirements as the processes that are modeled with the workflow system, i.e. the meta-processes should also be decoupled from the workflow system. This allows meta-processes to evolve over time similar to domain specific workflow processes.

3.3.4. Ad Hoc
Ad hoc workflow systems stem from the CSCW field. In this type of system, users are allowed to perform their work in an 'ad hoc' way where solutions and goals are accomplished with whatever tools and data are immediately available, usually from an end-user's browser or desktop. The primary purpose of ad hoc systems is to inform users of the current problems, issues and data needed to accomplish their particular tasks. This approach is appealing because it mimics the fast, furious and unstructured pace of some work activities. While ad hoc workflows provide minimal guidance and automation support, they are useful because the technology does not inhibit real work from being done. A workflow system should allow tasks to progress at their natural pace while maintaining a model of the work being performed. It is also important to carefully keep track of the status of ad hoc workflows because data and control dependencies as well as assignments are often not immediately viewable within an ad hoc system. The system should allow for dissemination of information to the widest possible boundaries of the organization to encourage ad hoc tasks, but at the same time take care to ensure that proprietary information and products are kept proprietary. Clear and concise interfaces to workflow tasks ease the access management problem.

3.3.5. Simulation
Similar to how code walkthroughs are used to review pre-execution behavior of software systems to ensure quality, the same benefits are gained from performing process walkthroughs. This may involve mock-ups, hand execution of the workflow process, or, in cases where the specification is too large or complex, simulation of the functioning system. Simulation allows the process stakeholders to perform 'what if' scenarios to ensure that the chances of undesirable side-effects after the workflow is deployed are kept to a minimum (Gruhn and Wolf 1995). Simulation capabilities of the workflow system should allow seamless integration with the actual execution environment merely by switching execution contexts. Simulation should support multiple executions across a data input set in addition to playback, capture and rewind. 'What if' or 'is this possible with these constraints' are questions that should be answerable by simulation results. Finally, any simulation should include extensive tracking, audit, analysis and evaluation capabilities to help determine if one set of outcomes or changes is more desirable than another.

3.3.6. Roles
Roles define the expected behavior of a process participant as determined by their responsibilities. Roles at a minimum should control access to data through a user-model that includes login and password permissions. The ability to read, write, change, or update data or even processes should be role-specific. In addition, execution of tools and processes should be reserved for clearly defined roles. Roles should be flexible enough to have a many-to-many mapping to process participants. This allows role sharing, delegation, intersection, subjugation and re-assignment even during the execution of a workflow process. At their core, roles should be people-centric and present the particular participant with the data and tools necessary to accomplish their work from the perspective of their role. Definition of roles along multiple dimensions, e.g. technical versus non-technical, allows role definitions to share organizational aspects. Questions of 'who owns/designs/implements/executes the process?' and 'who are the people involved?' should be easy to answer and define.
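A minimal sketch of such a role model, assuming hypothetical role names and permission strings, might track the many-to-many mapping and support delegation during execution:

```python
# Sketch of many-to-many role assignment with role-based permission checks
# and delegation; role names and permission strings are hypothetical.
from collections import defaultdict

class RoleModel:
    def __init__(self):
        self.permissions = defaultdict(set)   # role -> {"read", "write", ...}
        self.members = defaultdict(set)       # role -> participants
        self.roles_of = defaultdict(set)      # participant -> roles

    def grant(self, role, *perms):
        self.permissions[role].update(perms)

    def assign(self, participant, role):
        self.members[role].add(participant)
        self.roles_of[participant].add(role)  # one person may hold many roles

    def delegate(self, src, dst, role):
        """Re-assign a role during execution, e.g. while src is unavailable."""
        if src in self.members[role]:
            self.assign(dst, role)

    def allowed(self, participant, perm):
        return any(perm in self.permissions[r] for r in self.roles_of[participant])

roles = RoleModel()
roles.grant("reviewer", "read", "annotate")
roles.assign("alice", "reviewer")
roles.delegate("alice", "bob", "reviewer")
assert roles.allowed("bob", "annotate")
```

The same structure answers the 'who is involved?' questions above by inspection of the members and roles_of mappings.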

3.3.7. Multi-Tiered Architecture
Most current workflow systems assume that their software architecture is static in the sense that it does not evolve during execution. In addition, many systems also impose the restriction that the architecture of a workflow system does not change after the workflow is deployed into its execution environment and that any changes require re-deployment of the updated workflow and infrastructure. Implementation of the workflow system in a hierarchical manner aids in making the system flexible and extensible by minimizing the amount of software infrastructure that needs to change (Bolcer and Taylor 1996). A good approach is to place at the core the workflow functions most closely tied to network and operating system services and then add a series of thin architectural layers, each incrementally closer in capability to the workflow domain to which it will be deployed. Each architectural layer should have limited visibility, i.e. subsequent layers of the architecture should act as virtual machines enforcing a clear set of APIs to their services at a common level of abstraction. This multi-tiered approach allows minor changes in the configuration of the workflow system architecture to support major policy changes with respect to distribution, ownership, or presentation of processes and data. In addition, impact caused by changes to the workflow system, even while running, can be minimized with a layered architecture approach.
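The following toy sketch, with invented layer names, shows the layering idea: each tier exposes a small API and calls only the tier directly beneath it.

```python
# Sketch of thin architectural layers with limited downward visibility;
# the layer names and methods are hypothetical.
class TransportLayer:
    """Core tier: closest to network and operating system services."""
    def send(self, destination, payload):
        print(f"sending {payload!r} to {destination}")

class EventLayer:
    """Middle tier: adds notification on top of raw transport."""
    def __init__(self, transport):
        self._transport = transport          # sees only the layer below
    def notify(self, participant, event):
        self._transport.send(participant, {"event": event})

class WorkflowLayer:
    """Domain tier: speaks in activities and handoffs, not sockets."""
    def __init__(self, events):
        self._events = events
    def hand_off(self, activity, to):
        self._events.notify(to, f"{activity} ready for you")

engine = WorkflowLayer(EventLayer(TransportLayer()))
engine.hand_off("design review", "reviewer@example.com")
```

Swapping the transport tier (say, for a different protocol) is a minor configuration change that leaves the upper tiers untouched, which is the kind of policy flexibility the layering is meant to buy.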

3.3.8. Security and Access Control
The problem of security and access control is receiving a lot of attention with the popularization of networked software including the Internet and the WWW. The purpose of these mechanisms is to provide assurances to participants of the confidentiality, integrity and authentication of the users, services and data involved in a workflow process. Most security problems are not inherent to the workflow protocols or infrastructure itself, but arise from the policies surrounding their use. Security and access control mechanisms in a workflow system must accommodate variations in site policies, including those due to external restrictions. This may include tradeoffs between performance, usability and security. Workflow system designers should find a balance between these requirements based on the sensitivity of the data and participants. Also, the level of security should be easily configurable in the face of changing security and access control requirements. Major policy changes, such as relaxing restrictions to previously sensitive data or tightening restrictions based on inappropriate access, as well as the minor change of an employee switching job responsibilities, should all be easily configurable within the workflow system or on a process-by-process basis, even after deployment.

System programmers have multiple mechanisms for addressing security and access control. Tunneling involves encrypting and decrypting the communication stream between two secure points, which is useful for process participants at remote sites participating in a workflow process through the open Internet. Digital signatures can be used to ensure that the data or messages being sent are not intentionally or accidentally corrupted, guaranteeing integrity at both ends of the communication. Digital watermarks can guarantee uniqueness in order to better control artifacts. Signed applets are small pieces of executable software that can be configured to have certain permissions for changing or updating local data and configurations. Certificates may ensure the integrity of the software, preventing unwanted access to data, and can be controlled by the group or organization internally or externally to verify participating users or agents. A sandbox approach limits access to certain hosts, domains, times, data and services, but within these limits users and agents are allowed a wide latitude for access to information. Quarantine procedures allow participants to use new software or perform new actions only after a period of time and trial; such procedures should be based on an explicit level of trust and incorporated into the workflow evolution meta-process. Similar to how the software and architecture of the workflow system can be configured with respect to security and access control issues, users and their actions can also be managed in this way using these authentication schemes. This information can be used simply for auditing and tracking or to actually protect the integrity of the information. Firewalls and proxies can filter actions by users or prevent the dissemination of restricted or sensitive information. System programmers should be able to incrementally add or remove security mechanisms as the context dictates.
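As one small illustration of the integrity mechanisms mentioned above, a keyed digest can be attached to each workflow message so the receiver can detect corruption in transit. The shared key, its distribution and the message format are assumptions for the sketch, which uses only the Python standard library.

```python
# Sketch: message integrity for workflow handoffs via a keyed digest.
# The shared key (distributed out of band) and message format are assumed.
import hashlib
import hmac

SHARED_KEY = b"per-deployment secret"

def sign(message: bytes) -> bytes:
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, digest: bytes) -> bool:
    return hmac.compare_digest(sign(message), digest)

msg = b'{"activity": "approve-budget", "state": "complete"}'
tag = sign(msg)
assert verify(msg, tag)
assert not verify(msg + b" tampered", tag)
```

Mechanisms like this can be layered on or removed per process as the surrounding policies change, without altering the workflow definitions themselves.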

3.3.9. Fault Tolerance
A workflow system is considered fault tolerant if minimal interruptions to work occur in the face of inconsistent or corrupted data and missing tools or resources. The strategy for ensuring the appropriate level of fault tolerance in a workflow system is a function of the consistency and coherence of the data. Workflow process execution takes a bipolar approach to managing data. One technique tightly constrains the actions that can be performed by a process stakeholder or software agent but guarantees the internal data consistency of the workflow model and execution data. The other allows stakeholders to perform arbitrary actions to accomplish their activities but places minimal constraints on the structure of the intermediate and final data artifacts. Fault tolerance requirements are very different for each approach. In the first, data consistency is guaranteed because the actions that a process stakeholder can perform are limited to only those preserving consistent data transformations. In these types of systems, transactions are supported and allow rollback of workflow processes to previous states. This happens most often when the workflow model becomes out of sync with the real activities taking place, often the result of an unanticipated or unpredictable event. In the second approach, stakeholders may perform arbitrary actions toward accomplishing their activities, but minimal constraints are enforced on the structure of the intermediate and final data artifacts. The ad hoc nature of this approach may cause the data and artifacts being created and shared to be inconsistent. This inconsistency can be managed automatically by a constraint system and, failing that, by a process participant. All automatic resolutions should be made explicit so that the process participants do not get side-swiped during mid-stream execution by the system dumping an irreconcilable set of data constraints on them. The key is to allow the workflow system and its participants to manage inconsistent as well as consistent working data to minimize the amount of effort needed to get the workflow back on track after unpredictable occurrences.

3.3.10. Configuration Management
Configuration management should be pervasive, but decoupled, in a workflow system (Estublier 1996). Not only do the workflow processes themselves need to be versioned and kept under configuration management, but the intermediate and final data artifacts, such as documents, data, process execution state and the system architecture, should be as well. Any component of the system that is relevant to the advancement of the workflow and that will change over time should support checkpointing and versioning. Collections of artifacts, including the workflow system's architectural components, workspace views and execution context information, should be managed similarly to individual components. Depending upon how volatile the workflow process is, checkpoints can be made at timed intervals or at the point of significant events. For instance, a document may change versions on write events or on the handoff of ownership. Likewise, in a software development process, the source code could be checkpointed at the end of each working day to allow an automated nightly build. Recovery capabilities should be complemented with auditing and access control so that progress can be traced or context can be recreated, if necessary. This information is also useful for simulation and analysis leading to process improvement. An important aspect of a complex, distributed, multi-person workflow process is figuring out what happened as well as keeping track of what is currently happening. Inconsistent configurations and versions should be detectable by a constraint system. The ability to store and access previous activities and artifacts, retrieve their history, or annotate revisions with comments about the changes are all important features. In addition, storing this information using a full-featured configuration management system allows comparisons to be made between versions. This is especially useful when tracking and comparing workflow changes over time. Because process execution and evolution are seldom monotonic, participants may require certain versions of activities to be performed along with the relevant tool configurations and architectural components. As with other policies, the workflow system should be decoupled from, and not embed, an implicit configuration management and versioning policy in order to allow for flexible changes.
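A minimal sketch of event-driven checkpointing along these lines is shown below; the event names, state contents and audit fields are hypothetical, and a real system would use a full configuration management backend rather than an in-memory list.

```python
# Sketch: checkpoint workflow state on significant events and compare versions.
# Event names, state fields and the audit trail layout are assumptions.
import copy
import datetime

class VersionStore:
    def __init__(self, checkpoint_events=("write", "handoff")):
        self.checkpoint_events = set(checkpoint_events)
        self.history = []   # entries: (timestamp, event, who, note, snapshot)

    def record(self, event: str, state: dict, who: str, note: str = ""):
        """Checkpoint on significant events while keeping an audit trail."""
        if event in self.checkpoint_events:
            snapshot = copy.deepcopy(state)
            self.history.append(
                (datetime.datetime.now(), event, who, note, snapshot))

    def diff(self, i: int, j: int) -> dict:
        """Compare two versions, e.g. to track workflow changes over time."""
        a, b = self.history[i][-1], self.history[j][-1]
        return {k: (a.get(k), b.get(k))
                for k in a.keys() | b.keys() if a.get(k) != b.get(k)}

store = VersionStore()
state = {"document": "spec-v1", "owner": "alice"}
store.record("write", state, who="alice", note="initial draft")
state["owner"] = "bob"
store.record("handoff", state, who="alice", note="review handoff")
print(store.diff(0, 1))    # {'owner': ('alice', 'bob')}
```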

3.3.11. Transactions
Transactions allow the workflow system to manage change during the execution of a process following an enforced set of policies governing that change. A transaction in a workflow system should be able to lock a collection of resources in order to atomically complete an activity (Alonso 1997, Heineman and Kaiser 1996). Complementary to configuration management, transactions should provide a flexible set of change policies that can be dynamically enforced during process execution. Transactions may include timing constraints, such as an action being taken as the result of a service timeout, or data constraints, such as 'insufficient information' or 'result not produced yet' warning messages. In either case, transactions ensure the step-by-step integrity of the workflow execution state by completing actions as required or rolling back the workflow system to a meaningful state if the action cannot be accomplished without violating data constraint or consistency principles. Transactions may include privileges such as grant and revoke as well as the procedural or declarative knowledge of how and when to commit a valid transaction. Transactions may also describe triggers which precipitate some additional action on the successful completion of another transaction.
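A sketch of an activity-level transaction in this spirit, with hypothetical resources and a simple data constraint, locks the resources, applies changes and rolls back to the prior state if the constraint would be violated:

```python
# Sketch: lock a set of resources, apply changes atomically, and roll back
# on constraint violation. Resources and the constraint are hypothetical.
import copy
from contextlib import contextmanager

class WorkflowTransaction:
    def __init__(self, resources: dict):
        self.resources = resources
        self.locked = False

    @contextmanager
    def atomic(self, constraint):
        """Lock, snapshot, then commit or roll back as a unit."""
        self.locked = True
        snapshot = copy.deepcopy(self.resources)
        try:
            yield self.resources
            if not constraint(self.resources):
                raise ValueError("data constraint violated")
        except Exception:
            self.resources.clear()
            self.resources.update(snapshot)    # roll back to a meaningful state
            raise
        finally:
            self.locked = False                # release locks either way

budget = {"remaining": 100}
txn = WorkflowTransaction(budget)
try:
    with txn.atomic(constraint=lambda r: r["remaining"] >= 0) as r:
        r["remaining"] -= 250                  # overspends, so it is rolled back
except ValueError:
    pass
assert budget == {"remaining": 100}
```

A trigger, in this framing, would simply be a callback scheduled to run when the atomic block commits successfully.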

3.3.12. Automation
Workflow automation may include automated handoff and forwarding of documents, report generation, software installation, notification of significant events, tool invocation, configuration of an end-user's workspace and, most importantly, the execution and completion of tasks independent of the workflow participants. The extent to which these activities can be accomplished without human intervention is the level of automation that the system supports. A workflow system's automation infrastructure should include configurable support for both human and non-human participation such as intelligent agents. Scripting, including capture and playback of user interface manipulation, aids in the transition from human-executed tool interaction to machine-controlled interaction. Scheduling can be used to initiate and carry out daily, weekly or recurring tasks. The more a workflow process is executed, the more familiar it becomes. Processes that have become fixed, tedious, common or procedural are excellent candidates for automation. The workflow system should allow an incremental path from human-executed activities to computer-executed ones. Automation may further the execution of a process proactively or reactively by executing process fragments or by sending out information to fillip both work and participants and to respond to changes, completions or postings by them.

3.3.13. Concurrency
Concurrency should occur at both the architectural level and the process description and presentation level. Concurrency at the architectural level allows components to execute or change state independent of the other parts of the system (Barghouti et al. 1990). This may allow a greater degree of overall system efficiency, increased scalability and evolution, and a greater degree of dynamic change. A high degree of concurrency in a process system's architecture should be matched with a lightweight, multi-threaded component model. This allows small, incremental changes to take place with minimal change impact on the rest of the system, even during runtime. In addition, concurrent components allow the overall system to be more easily customized by instituting different control and execution policies governing the scheduling, behavior and constraints of the components.

Many of the same benefits from software and hardware concurrency and parallelization are applicable to scheduling human-executed activities, where concurrent events are commonplace. Day-to-day work often contains concepts such as lookahead, simultaneity, data parallelism, resource sharing, overlapping, multiplicity, replication, time sharing and distributed execution at multiple levels. Because of these concepts, online workflow systems have the potential to realize efficiency benefits that are not easily effected in an offline setting. When modeling a workflow process, concurrency should be explicitly presented in the majority of cases. When confronted with a seemingly serial list of activities, process executors may simply assume that no concurrency is possible, which may reduce the efficiency of the process. In a large and complex process, potential concurrences and serializations might not be pursued without explicit guidance. There are many factors affecting the combination of human and computer executed activities in a workflow system. Multi-tasking or multi-threading capabilities of both human and software agents should be exploited in a process execution environment as long as this does not increase the complexity of the task or the overhead of communication and coordination. The independence and interdependence of the activities at hand is a good measure for determining whether activities should take place in parallel. Without an explicit presentation of concurrency, it is difficult to determine the efficiency of the process.
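The scheduling idea can be sketched directly: independent fragments run in parallel while a dependent activity waits on their results. The activities below are hypothetical placeholders.

```python
# Sketch: execute independent workflow fragments concurrently and serialize
# only the fragment that depends on their results. Activities are placeholders.
from concurrent.futures import ThreadPoolExecutor

def write_user_docs():
    return "docs drafted"

def run_regression_tests():
    return "tests passed"

def package_release(docs, tests):
    return f"release built after: {docs}; {tests}"

with ThreadPoolExecutor() as pool:
    # Independent activities proceed in parallel...
    docs_future = pool.submit(write_user_docs)
    tests_future = pool.submit(run_regression_tests)
    # ...while the dependent activity waits on both before starting.
    result = package_release(docs_future.result(), tests_future.result())

print(result)
```

Making the parallel structure explicit in the model, rather than leaving it implicit in a serial list, is what lets executors (human or software) exploit it.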

3.3.14. Abstraction/Escalation
The ability to employ simplifying assumptions and abstractions in the modeling, scheduling and execution of workflow processes is necessary to allow a workflow to scale to a large, complex series of activities involving the coordination of many people over a long period of time. Not all of the same issues will apply to the modeling and execution of a process across all levels of abstraction because of organizational and multiple-stakeholder issues. In a workplace setting, abstraction centers around common process goals, and escalation follows a corporate chain of command. Because of these issues, it may be necessary to compartmentalize aspects of a process through information hiding and abstraction. As an activity or process progresses, the prioritization of the tasks evolves. This evolution may shift the encapsulation boundaries depending on how the changes are handled. Anatomization of specific tasks should support delegation, rerouting, or reassignment in addition to prismatic separation of tasks across social boundaries. Conversely, combining several tasks into a larger workflow with possibly competing goals at different levels of abstraction should be done with the strictest of encapsulation boundaries to aid in the success of its execution. Technologies such as hyperlinks and mobile process or agent technologies can aid in the scoping and context of a workflow that is specified across multiple levels of abstraction.

3.4. External Tool Integration

Tool integration is crucial to the execution of a workflow process. Workflow systems should specialize in delivering the right information to the right person at the right time. Once a person has the data artifacts at hand, the tools to produce, consume, manipulate, translate, annotate or augment them should be easily within reach as well. While the goal of a workflow process is to appear seamlessly integrated with the end-user's desktop and tools, the implementation to do so may not be easily accomplished. Tool and workflow integration may require multi-directional communication where events or messages can be initiated by the user, tool, or workflow system. Multi-synchronous communication may be required in external tool integration by allowing notification and synchronization to be additive and independent of the tools, agents and participants involved. Not all tools were designed to be integrated in this way because of constraints of the implementation language or platform. Because of this, tool communication should be supported by multiple protocols, across platforms, and in multiple languages if possible (Anderson et al. 1994, Barghouti and Estublier 1997, Fernstrom and Ohlsson 1991, Hubert 1991, Kadia 1992, Krishnakumar and Sheth 1995, Valetto and Kaiser 1995).

Because workflow systems inherently support communication and collaboration, it follows that communication and collaboration tools such as online conferencing, email, shared workspaces, synchronized browsing, collaborative authoring, calendaring and other computer supported cooperative work approaches should be easily accessible from within a workflow process. Going the other way, capturing and measuring how participants use desktop tools to accomplish work may form a basis for improving the efficiency of workflow processes or discovering new ones. Instrumenting online and offline work with metrics and measurement tools to capture interesting events helps accomplish this. Also, maintaining a strong distinction between the processes that participants follow and the data they create allows the greatest flexibility in their evolution over time.

3.4.1. Multi-Directional
Inter-tool communication needs to be multi-directional. In order to keep its internal workflow model consistent, a workflow system needs to be notified of work initiated or completed by external tools. Also, collections of tools may require notification and forwarding of appropriate artifacts at the correct points in the workflow process in order to control their execution and use by participants. This notification can occur serially or in parallel. Serial communication usually occurs in a point-to-point manner. Messages are sent from a sender to a receiver, sometimes at a fixed interval. Serial connections should have the ability to be initiated by either the external tool or the workflow system. While serial communication with external tools helps guarantee data or workflow model consistency by enforcing a partial ordering on the communication, complex workflows involving integration of multiple tools and coordination of many people across several levels of abstraction or organizational boundaries may depend on parallel communication. A component, tool, or person may be acting as both sender and receiver at several points in time during a workflow execution. In addition, multicasting of data and artifacts and multi-tuning based on interest along different data distribution channels, whether by the workflow system or the external tools, allows for a more efficient distribution of information. Because it allows greater scalability and evolution, this is an elegant way to model communication of data and artifacts among a set of cooperating people and agents, but it requires a higher degree of complexity in the infrastructure to handle contention and fairness issues.

3.4.2. Multi-Synchronous
Execution by external tools and execution of a process by the workflow system may occur independently of each other. The workflow engine may suspend execution pending completion of work or handoff of artifacts from external tools in a synchronous fashion, or it can continue downstream, effecting the handoff and control of the process as work is completed asynchronously. Process fragments that have inter-dependencies, such as artifact handoff or resource contention, should be specifiable with synchronization and serialization primitives in the workflow modeling language. Otherwise, independent fragments should be able to be scheduled and executed independently and in parallel when possible. Asynchronous execution should not be overconstrained. This allows work to continue without unnecessary dependencies. Also, asynchronous execution between the workflow system, external tools, or even process fragments allows greater flexibility in reusing and scheduling processes because different models of execution can be used in completing the processes. The workflow system should be able to support both synchronous and asynchronous coordination of workflow processes.

3.4.3. Multiple Languages
Workflow process support environments must integrate and interoperate with diverse tools, platforms and environments. Flexible mechanisms such as integration with multiple languages provide the ability to evolve the workflow system in concert with its deployment environment and changing work context (Maybee et al. 1996). The ideal workflow system affords workers transparent integration with the appropriate external systems as needed, desired, or dictated by the work culture. Workflow activities may need to integrate with external systems either through application programmatic interfaces (APIs) or through intermediate forms, depending upon the 'openness' of the system. Because of the diversity of work performed and the stakeholders that may be involved, system integrations need to occur in the language best suited to accomplish the integration. To handle this, multiple scripting and programming languages can be used to coordinate between the workflow system and external tools as long as calls to the workflow system and to the external objects, tools and services can both be accommodated. In addition, intermediate forms that embody the process or integration description are useful for integrating tools across implementation languages. A helpful strategy for this is to allow easily embeddable components or applets for parsing the exchange formats such as plain text, XML, or other interchange formats.
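The intermediate-form idea can be illustrated with a small XML fragment parsed with the standard library; the element and attribute names are hypothetical, not a standard workflow schema.

```python
# Sketch: a neutral interchange form for part of a process, parsed with the
# standard library. Element and attribute names are assumptions, not a schema.
import xml.etree.ElementTree as ET

interchange = """
<process name="expense-approval">
  <activity id="submit"  role="employee"/>
  <activity id="approve" role="manager" depends-on="submit"/>
</process>
"""

root = ET.fromstring(interchange)
for activity in root.findall("activity"):
    # Any tool, in any implementation language with an XML parser,
    # can recover the same structure from this neutral representation.
    print(activity.get("id"), activity.get("role"), activity.get("depends-on"))
```

Because the description lives in a language-neutral text form, the integration 'glue' can be written in whichever language is best suited to each external tool.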

3.4.4. Cross-Platform
Work contexts and environments not only involve diverse stakeholders and tools, they include multiple computing platforms. A flexible workflow system, in order to successfully deploy and execute a process, may need process fragments and software architectural components to run across multiple operating systems and computing platforms due to the availability of resources and tools. Cooperation, control and coordination in a workflow process do not always happen within the sphere of a standardized or shared technology infrastructure. Processes that lack the ability to run across platforms risk inhibiting the effectiveness of putting the right information at the right place in a timely manner. Processes that go through a technical disconnect because of cross-platform issues risk derailing the efficient passing of artifacts by placing an extra burden of format translation or tool inconsistency resolution on participants which is tangential to the flow of work. Further, control and coordination policies may be limited by technology issues rather than the workflow needs of the stakeholders.

3.4.5. Metrics and Measurement
Metrics and measurement provide a foundation in the workflow infrastructure for better management and feedback. Both runtime and static metrics are useful for determining the efficiency and effectiveness of a workflow process. Collection tools can be used to discover new processes or provide qualitative measurements for process comparison and optimization. Depending on the degree of automation, collecting and analyzing metrics may provide the basis for self-correction or self-optimization, dependent upon the amount of dynamic change and reflexivity the workflow system supports. Metrics and measurement tools can be used to accomplish automated tracking and monitoring to ensure the smooth execution of large and complex processes which would otherwise be difficult for a human to monitor. Metrics also provide a basis for model-checking to determine consistency and completeness as well as for evaluating throughput or change impact, either through actual deployment or simulation. Finally, metrics and measurement infrastructure aids in accountability by maintaining information on audit trails or access logs as well as versioning and recovery information.

3.4.6. Process Data and Programs
An explicit, manipulatable process model is crucial to providing an accurate representation of work, determining status and enforcing control policies. Separation of the workflow process model from the data, such as artifact and resource models, allows evolutionary changes to the process independent of the products being worked on. Likewise, the data can be changed with minimal impact on the process's control, resource and data flow specifications. By maintaining a clear abstraction between process data and programs, reuse and customization are facilitated (Heimbigner 1992, 1996). Depending on the needs of the process and the sophistication of the workflow infrastructure, separation of the process model's behaviors from its state can allow dynamic customization, configuration and contextualization to fit more easily the work environment in which the process is being deployed.

3.4.7. Communication and Collaboration Tools
Execution of a workflow process involving many people inherently requires managing the association of individuals with varying degrees of contact and interactions in these groups. Activities performed automatically or by the process participants occur over both time and space as well as synchronously and asynchronously. Completion of these activities often requires both formal and informal communication channels beyond the scope of the workflow process specification. These channels of dialog can be encouraged via the integration of communication and collaboration tools. Integration with calendaring, scheduling, shared bulletin boards and email provides a strong basis for maintaining the workflow history and managing its completion. Computer supported cooperative work tools, such as chat tools, distributed authoring and versioning, shared editing, event notification or messaging and registration, lower the barrier for participants to reach agreement on issues or to quickly communicate rationale and reasoning. Also, allowing participants the ability to address outstanding issues with communication and collaboration tools aids in averting and resolving potentially work-stopping issues. The integration of these tools into a workflow environment works both ways. As an example, being able to embed an executable workflow process into an email message, a WWW page, or a bulletin board thread provides a powerful mechanism for actively guiding a participant to an activity's completion, possibly avoiding unnecessary work. The addition of a structured work description that a workflow system can provide allows a clearer specification of the work to be completed and individual responsibilities. This removes some of the risk of overlapping work or working at cross-purposes.

3.4.8. Events
Events involve asynchronous notification of possibly distributed components at some instant of time. Events can be broadcast to interested parties based on filtering and registration. Workflow systems built on an event-based model are flexible and reconfigurable because they allow exchange of components and re-routing of events to alternative destinations, possibly over the course of the workflow execution. Workflow systems based on an event broadcast model assume a certain style of integration and interoperation. Work environment tools and process fragments can be loosely coupled, allowing coordination between heterogeneous components using this approach (Barrett et al. 1996, Ben-Shaul and Ifergan 1997, Geppert et al. 1996, Taylor et al. 1996). Also, event-based integration encourages cooperation between stakeholders as events are lightweight and notification takes place asynchronously, allowing work to coincide more frequently. However, the workflow system and process execution should be designed such that process participants are not overwhelmed or bothered by unnecessary information. Software and process components as well as people should be able to register their level of interest or latency of response to filter events appropriately. Event notification can be implemented using push technologies that send events as they are generated or a pull model where a participant is required to actively seek the information. Both the push and pull models can also be scheduled to send or receive events at regular intervals. Specifications for handling publish/subscribe and local/global events should also be definable within the workflow system. It is important for a workflow system to support dynamic change of notification and broadcast policies, especially with respect to interest and filtering. Finally, because workflow process execution may occur across many platforms using multiple technologies, multi-protocol support aids in its adoption and usage. Support for email, WWW and various Internet and messaging protocols allows events to be broadcast over existing network infrastructures. Integration of paging, phone, radio, wireless and other communication messaging tools to send events is helpful for capturing offline work. Multi-protocol support for events increases the ability to track, trace, observe and capture significant occurrences in a workflow process.
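The registration-and-filtering style of event broadcast described above can be sketched as a tiny broker; the topic names, event fields and subscriber callbacks are hypothetical.

```python
# Sketch: publish/subscribe event dispatch with per-subscriber interest
# filtering. Topics, event fields and callbacks are assumptions.
from collections import defaultdict

class EventBroker:
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> [(interest, callback)]

    def subscribe(self, topic, callback, interest=lambda event: True):
        """Register a level of interest so unwanted events are filtered out."""
        self.subscribers[topic].append((interest, callback))

    def publish(self, topic, event: dict):
        for interest, callback in self.subscribers[topic]:
            if interest(event):
                callback(event)                # push only to interested parties

broker = EventBroker()
broker.subscribe("artifact.changed",
                 callback=lambda e: print("notify reviewer:", e["name"]),
                 interest=lambda e: e.get("critical", False))
broker.publish("artifact.changed", {"name": "design.doc", "critical": True})
broker.publish("artifact.changed", {"name": "notes.txt"})   # filtered out
```

Re-routing events to alternative destinations then amounts to changing the registered callbacks, which can be done while the workflow continues to run.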

3.5. Customization and Reuse

Customization and reuse amortize the cost of development and deployment of a workflow process over time and changing work contexts. One of the canons of reuse in software is that if a software application is useful, it will be used in other work contexts. The same principle can be applied to user interface components, object libraries and even workflow processes. Workflow processes and their execution infrastructure should be flexible enough to be re-targeted to new work environments and easily reused with different hardware, software and social constraints. Because workflow processes and their execution differ so greatly between organizations and groups owing to technical and social factors, adoption into a new context may require some 'glue code'. Programmability of the system is required to allow better matching between the 'off the shelf' process and the expectations of the stakeholders. Also, because a priori knowledge of a workflow's evolution requires prescience beyond our current predictability and understanding of most processes, the workflow system and the process, which includes the views and the execution model, should both be extensible (Bandinelli 1995, Bandinelli et al. 1993b, Fuggetta 1996, Malone et al. 1995, Suchman 1987, Sutton and Osterweil 1997, Whitehead et al. 1995, Young 1994).

Because control and coordination policies vastlydiffer among groups, support versus dictate issuesshould be configurable and consistent with thework culture. Work policies should be customizableto allow the workflow system to adapt to thegroup or organization rather than the other wayaround. User interactions with the system, toolintegration and environment should also be cus-tomizable. Management of data, artifacts, resourcesand processes should include sorting and searchingto allow users to find the best solution match forthe problem at hand. This may include partialsolutions where stakeholders only use the fragmentof the workflow that is relevant to their task.

3.5.1. Flexibility

Flexibility of a workflow system is evident in the way workflow processes are presented, used and executed in the face of changing real-world activities. A workflow system should support dynamic creation, deletion and update of views into the process, including different views for different stakeholders based on their interest and expertise. The workflow modeling and execution infrastructure should be flexible enough that graphical representations can be exchanged or even removed without interrupting the underlying process model. The system will need to adapt to the target environment rather than the other way around. The execution model, as well as the initiation and completion policies, should be easily changeable within the system. Tool availabilities and integrations should also be configurable to allow on-the-fly or on-demand access in addition to prespecified tool sets. Resource notification should also be flexible, allowing email, scheduling including time constraints, feedback, announcement and agenda management for people and other process resources. Artifact routing and synchronization should support a wide range of policies, even during the course of a workflow execution, to allow for alternative processes and workarounds to occur. Support for concurrent access to data as well as concurrent execution of workflow process fragments allows greater flexibility in sharing artifacts and encouraging cooperation between steps of a process. As with any best laid plans, exceptions will always occur. The workflow system should allow for multiple paths to completion and exceptions such that unforeseen obstacles and limitations can be easily hurdled without invalidating the workflow process model or its planned execution.

3.5.2. Programmability

Workflow processes should be programmable in multiple ways. A workflow system should allow workflows to be programmed both visually and textually, depending on the abstractions and level of expertise of the user. Visual workflow programming languages allow the process programmer or end-user to view complex interactions more easily and arrive at an acceptable workflow specification because object interrelationships or transformations are sketched out and concrete. Despite their advantages, visual workflow programming languages sometimes lack the power and expressibility to succinctly describe a complex series of actions and require textual codification. Open APIs for each architectural component of the system allow for greater flexibility, customization and reuse by supporting access to only the data and functionality that is relevant to the work context at hand. These APIs should be supported at multiple levels of abstraction to allow both fine-grained customizations as well as high-level changes. Support for an open API across multiple languages also improves coordination of diverse tools and components. Support for all of these approaches aids in the reuse of work when adopting, specifying and executing off-the-shelf processes.
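A hypothetical sketch of what open APIs at two levels of abstraction could look like is given below; the WorkflowEngine, ProcessDefinition and Step names are illustrative assumptions, not interfaces of any system discussed here.

```java
import java.util.List;

// High-level API: coarse-grained operations a process programmer might script.
interface WorkflowEngine {
    ProcessInstance start(ProcessDefinition definition);
    void suspend(ProcessInstance instance);
    void resume(ProcessInstance instance);
}

// Low-level API: fine-grained hooks for customizing individual steps.
interface Step {
    String name();
    void execute(ProcessContext context);   // textual/programmatic codification of one activity
}

interface ProcessDefinition {
    List<Step> steps();                      // a visual editor could emit this same structure
}

interface ProcessInstance {
    Step currentStep();
}

interface ProcessContext {
    Object getArtifact(String name);
    void putArtifact(String name, Object value);
}
```

The point of the sketch is that a visual editor and a textual language could both target the same ProcessDefinition structure, so either style of programming reaches the same underlying API.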

3.5.3. Extensibility

Successful workflow processes, like successful software, will need to be extensible to fit into a new usage context or environment. This new context may be a use of the workflow process that the original process designer had not anticipated, such as a new domain. This may include embracing new domain models, enabling domain-specific presentations, and subclassing and overriding data fields and methods, possibly even during the execution of the process. The process model should allow functionality to be incrementally added in, such as new process execution primitives, alternative user interaction models, and the set of desktop tools and components used in the completion of process tasks. A strong separation of concerns, possibly through event-based integration, allows minimal impact on pre-existing components when extending the system. Clear architectural layers with open APIs at each layer aid in reducing process rework from extending existing process objects and fragments. The workflow infrastructure should be extensible also. Providing support in the infrastructure to dynamically change and integrate new components even after deployment helps ensure the usefulness of the system in the face of changing work requirements.
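As a small, hedged illustration of extension through subclassing, the fragment below adds a domain-specific data field and presentation to a hypothetical Activity class without touching the original; the class names are assumptions made only for this example.

```java
// Illustrative sketch of extension through subclassing; Activity is hypothetical.
class Activity {
    protected String name;
    Activity(String name) { this.name = name; }

    // Default, domain-neutral presentation of the activity.
    String describe() { return "Activity: " + name; }
}

// A new domain (document review) added without modifying the original class.
class ReviewActivity extends Activity {
    private final String documentUrl;        // domain-specific data field added by the subclass

    ReviewActivity(String name, String documentUrl) {
        super(name);
        this.documentUrl = documentUrl;
    }

    // Domain-specific presentation overriding the default.
    @Override
    String describe() { return "Review of " + documentUrl + " (" + name + ")"; }
}
```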

3.5.4. Support/Dictate

A workflow system must address the tradeoff between supporting a user's activity and dictating it to them. The question of how closely a process must be followed, and in what order the steps must be taken, is largely a secondary effect of the work culture in which the process is being executed. The less leeway a worker has to improvise because of quality assurance, work standards or artifact dependencies, the more the workflow system dictates the worker's activities. Processes that require strict adherence should support dictating the activities; however, over-enforcement of a process may result in an unsolvable path to a solution. At these times the process participant will go outside of the system to overcome the limitations of the workflow infrastructure, which may have been caused by insufficient customization to the problem at hand or an exception which the system was not able to recognize or handle. Each work environment has its own requirements for accomplishing workflow processes, and the workflow system should be flexible enough to allow it to adapt to the changing needs of the organization. Historically, workflow and process systems have suffered from over-constraining the work being described, so care should be taken to err on the side of support when possible.

3.5.5. Customization

Customization of the workflow system is key to its usefulness when adopting processes in a new work environment, by reducing startup costs and increasing the applicability. Customization may need to take place across all levels of the system, including execution and notification policies, object locations and even persistency and update policies. Customizations should be separated into different levels by abstracting away or hiding the technical details for stakeholders that do not need them in order to complete their tasks. It follows common sense that a non-technical end-user will not want to customize the execution model, but may want to change the graphical presentation to allow the interaction to be more consistent with other tools in their workspace. Further, supporting individual customizations, such as allowing end-users to decide whether to be notified by email or by updating their agenda, increases the flexibility and chances for success in the deployment and execution of the workflow process. Customizations should be allowed using property sheets, templates and style sheets, programmatic calls to the system's APIs, and integration with rapid application development and architecture manipulation tools.
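A minimal sketch of the property-sheet style of customization is shown below, assuming a hypothetical workflow-user.properties file and a notification.channel key; neither is part of any system surveyed here.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

// Sketch: an end-user customization read from a property sheet rather than code.
public class NotificationPreferences {
    public static void main(String[] args) throws IOException {
        Properties prefs = new Properties();
        try (FileInputStream in = new FileInputStream("workflow-user.properties")) {
            prefs.load(in);
        }

        // e.g. notification.channel=email  or  notification.channel=agenda
        String channel = prefs.getProperty("notification.channel", "email");
        if (channel.equals("agenda")) {
            System.out.println("Updating the user's agenda instead of sending email.");
        } else {
            System.out.println("Sending an email notification.");
        }
    }
}
```

The same preference could equally be exposed through a template, a style sheet, or a programmatic call to the system's API, as the text above suggests.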

3.5.6. Searching/Sorting

With the advent of the WWW, searching and sorting have become indispensable to the user's desktop. The ability to quickly find information that is relevant to completing an activity is important to a smooth-running workflow process, by making sure the right information is accessible from the right desktop. Users may not only need information, but may require guidance too. The answer to ‘How do I do this?’ should be derivable by searching for solutions, which may be workflows themselves, and sorting through their relevance. Searching and sorting can be both proactive and reactive. Proactive searches allow the process participant to go out and seek tools and artifacts they feel are necessary to completing their task and provide a summary of relevant results, rather than what is given to them by the process designer in their workspace or on their desktop. Reactive searching is usually initiated by some constraint violation or change in data. It is usually accomplished by an automated agent which combs the information space and returns summaries that might be relevant at a particular step in a workflow process. Sorting can be performed using various keys such as date, priority or amount to determine the relevance and quickly generate an overview of complex and distributed data. It is a difficult problem to manage the visibility of artifacts over time or distributed over a network. Searching and sorting capabilities integrated with a workflow infrastructure allow a more decentralized management model of distributed work by locating and using artifacts and tools on demand and not depending on centralized tracking of all the workflow components.
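The key-based sorting described above can be illustrated with a short sketch; the Artifact class and its fields are assumptions introduced only for the example.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.Date;
import java.util.List;

// Sketch of sorting workflow artifacts by keys such as priority and date.
class Artifact {
    String name;
    int priority;        // lower number = more urgent
    Date modified;
    Artifact(String name, int priority, Date modified) {
        this.name = name; this.priority = priority; this.modified = modified;
    }
}

public class ArtifactOverview {
    public static void main(String[] args) {
        List<Artifact> artifacts = new ArrayList<>();
        artifacts.add(new Artifact("design.doc", 2, new Date(100000L)));
        artifacts.add(new Artifact("test-plan.doc", 1, new Date(200000L)));

        // Sort by priority, breaking ties with the most recently modified first.
        Comparator<Artifact> byPriority = Comparator.comparingInt(a -> a.priority);
        Comparator<Artifact> byDateDesc = (a, b) -> b.modified.compareTo(a.modified);
        artifacts.sort(byPriority.thenComparing(byDateDesc));

        for (Artifact a : artifacts) {
            System.out.println(a.priority + "  " + a.modified + "  " + a.name);
        }
    }
}
```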

3.5.7. Process Fragments

Processes are made up of relationships between many components such as people, activities, tools, documents, etc. Not all components are useful or relevant in the execution environment of the workflow process. Process fragments are small sets of relationships between a small number of process objects in a specification. Process fragments should be lightweight, easily manipulatable, self-contained and usable independent of other parts of the process. What makes up a fragment is usually determined by the process programmer or end-user, so the workflow system should allow grouping and embedding of these processes as well as the traditional actions of cutting, copying and pasting. Process fragments are useful in that they are usually a small series of steps that have proven useful in the application and execution of another workflow, but reused in a new context. Fragments also encourage the cleaner breakdown of work specification. A small, self-contained process fragment might more easily be embedded in an email or WWW page than the textual description of the process, and with better interactive guidance capabilities. Process fragments not only encourage abstraction and reuse, they encourage evolution as well. As fragments are reused over time, the most useful relationships and process patterns may begin to emerge.
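One plausible (and purely illustrative) data structure for such fragments is sketched below; ProcessObject and ProcessFragment are hypothetical names, and the copy operation stands in for the cut/copy/paste and embedding actions mentioned above.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a self-contained process fragment that can be grouped, embedded and copied.
class ProcessObject {
    final String kind;   // "person", "activity", "tool", "document", ...
    final String name;
    ProcessObject(String kind, String name) { this.kind = kind; this.name = name; }
}

class ProcessFragment {
    final String name;
    final List<ProcessObject> objects = new ArrayList<>();
    final List<ProcessFragment> embedded = new ArrayList<>();   // fragments grouped inside fragments

    ProcessFragment(String name) { this.name = name; }

    void add(ProcessObject o) { objects.add(o); }

    void embed(ProcessFragment f) { embedded.add(f); }

    // "Copy/paste" into a new context: copying the structure keeps the fragment self-contained.
    ProcessFragment copy() {
        ProcessFragment c = new ProcessFragment(name);
        c.objects.addAll(objects);
        for (ProcessFragment f : embedded) {
            c.embedded.add(f.copy());
        }
        return c;
    }
}
```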

3.6. Distribution

Tools and networks have co-evolved, advancing the sophistication of the computer technology and resource availability to the average user. While this technology has greatly advanced, basic coordination and collaboration support has not kept pace. Distributed computing and network technology have yet to exploit the full potential productivity benefits of such widespread cooperation over communication networks. Collaboration over these networks to date has largely been limited to sharing of data in common formats and across common infrastructures and protocols (Kaiser et al. 1997, 1998a, b, Nutt 1996). The policies and interdependencies for sharing, guiding the use of, creating, destroying, manipulating and handing off of this data have largely been implicit or unspecified. This often leads to miscommunication, especially with workflow process participants who interact infrequently, such as team members at distributed sites. Because the participants may not be within the same geographic location, and may have working schedules that never overlap, people participating in a distributed workflow process, such as a virtual workgroup, lack the ability to observe and anticipate the factors that affect the interdependencies between tasks. A virtual workgroup typically lacks the opportunities for informal conversations, such as those encountered in a traditional organization during coffee breaks or hallway exchanges, that would provide the ‘overall picture’ needed to anticipate coordination problems. Likewise, the time cost in asking what everyone is working on, in order to avoid duplication of effort, may exceed the time taken to simply perform the task at hand.

The Internet, including both private and semi-private networks such as intranets and extranets, aids in scaling down the issues of remote access and availability of both data and resources. Leveraging these types of network infrastructures allows decentralized cooperative work to take place at distributed sites. Various parts of the workflow process can be executed at the location that is most appropriate using the resources available. This overcomes the problem in which some resources are potentially non-distributable because the cost would be prohibitive, but the work utilizing the resources can be completed elsewhere. Each remote process component or fragment may progress independently of other processes with which it coordinates. Local data and execution state can be protected or handed off to other remote sites as appropriate. Handoff of artifacts and process control can easily piggyback off the naming and routing idiom of the Internet and WWW using Domain Name Services (DNS) and Uniform Resource Locators (URLs (Berners-Lee et al. 1994)), which even the most non-technical user has come to understand. Various parts of the workflow process, such as tools, artifacts, specifications, user interfaces and even the infrastructure itself, including the means to change or execute the process, should have built-in network support. This encourages design of highly distributed processes, even across low-bandwidth and non-PC-based clients.

3.6.1. Intranet/Internet/Extranet

The Internet is a publicly shareable set of communication networks that cooperate through standardized protocols and allow global access to an abundance of information. Networked communication is critical to distributed workflow within an organization that is geographically dispersed; however, issues of proprietary data and access control limit the types of workflow that can be accomplished using the Internet at large. An intranet is a private Internet that limits access and data to the appropriate stakeholders. While intranets allow the sharing of data such as product information, client databases, standards of practice and work guidelines, the information transmitted usually does not contain workflow guidance or automation capabilities beyond structured text and applet execution for displaying the data. Even applets that may perform a series of complex activities do not involve coordination with other automated agents, activities, workflow processes, or people across time and space. A workflow system should allow downloading of a process specification as well as the means to execute the specification over the protocols in place, in addition to managing the coordination and cooperation aspects of each partial process. The mechanism to do this should be the same for executing a workflow process on an intranet or across the Internet, with the exception that extra care should be taken to ensure data and task privacy.

Extranets, external intranets, allow remote process participants to share information behind firewalls, even between two different companies' internal intranets, while maintaining security of the information. Access is usually through a Web site using secure channels to allow a process participant access to resources and data previously shared only at the host company's actual work site. Workflow processes should allow communication across extranet sites. A good approach is to serialize all intermediate messages using only the protocols available for transporting secure information across a public network and those that are allowed to bring data packets into the protected area and through firewalls, like HTTP and secure HTTP. Support for digitally signed processes, tools and artifacts permits occlusion of other groups' or organizations' proprietary processes through abstraction and data protection. The ability to privately tunnel across public communication networks, a form of secure point-to-point data transfer over public lines, provides data, tools and processes to remote users, setting up a component of the infrastructure needed for execution of network-enabled workflow processes.
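As a rough illustration of serializing workflow messages over the protocols that typically pass through firewalls, the sketch below posts an XML-encoded process fragment over secure HTTP; the endpoint URL and payload format are assumptions, not part of any system discussed here, and digital signing is omitted.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class ExtranetHandoff {
    public static void main(String[] args) throws Exception {
        // Hypothetical extranet endpoint reachable through the firewall over HTTPS.
        URL endpoint = new URL("https://extranet.example.com/workflow/handoff");
        HttpURLConnection conn = (HttpURLConnection) endpoint.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml");
        conn.setDoOutput(true);

        // A serialized process fragment; in practice this would also be digitally signed.
        byte[] payload = "<fragment name='review'><activity name='approve'/></fragment>"
                .getBytes(StandardCharsets.UTF_8);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(payload);
        }

        System.out.println("Handoff acknowledged with HTTP status " + conn.getResponseCode());
    }
}
```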

3.6.2. Distributed Processes

Distributed work requires complementary and cooperating distributed processes. Activities are heavily influenced by local affordances such as local work context and culture, tool and resource availability, and mechanisms for handing off artifacts and documents. Different activities or process fragments can be coordinated through messages using various transport mechanisms and network protocols. The threshold for collaboration by remote process participants is reduced accordingly with the number of communication mechanisms provided and the amount of overhead the user ‘pays’ in order to handle communication above and beyond the work that is required of them. Allowing participants to choose which mechanism is most appropriate for both their local work context and the shared goals with other users, such as WWW browsers, network-aware applications, modem updates, email, or other forms of telecommunication device interactions, helps avoid potential miscommunications. Supporting processes in the workflow infrastructure that can be easily downloaded over these communication channels, in addition to the means to execute them, strongly encourages distributed cooperation.

Distributed execution of workflow processes is more prone to site failure. The failure of one mechanism of communication should precipitate action in the form of an alternative means of contact and handoff. It is important that the workflow system also incorporates the concept of a critical path. Work that ‘falls in the cracks’ or ‘gets lost in the system’ usually occurs between organizational and group boundaries that often involve social expanses as well as physical ones. Policies about distribution should be flexible and allow for distribution of solely the data, the process, or both, depending on the process requirements. Distributing a process generally leads to better utilization of distributed resources, but extra care must be taken to ensure consistency and integrity when distributing the data.

3.6.3. Local/Global

A workflow process can involve thousands of process fragments ranging from simple automation tasks to inter-process coordination across different levels of abstraction. Workflow should fit in at the individual, group and enterprise levels, in addition to supporting the ability to assemble smaller processes into larger ones. Participants in these processes can view their work as being either local or global (Kaiser and Whitehead 1997). In a cooperative work setting, all participants are expected to emphasize the work of the community over the individual. However, if a workflow is viewed as only benefiting the global work community and not the individual, the adoption and execution of the process will not be successful. It is important that the workflow is seen for its value in accomplishing local tasks in the work context in addition to not violating global constraints. Individuals, while accomplishing their work, may customize an activity or artifact to best perform their task at hand. While there may exist numerous local optima, some changes violate the global constraints of the workflow. Other activities that relied on the specific format, tool, or process model may break because of the local customizations, causing the output of one activity to no longer mesh with the input of another. Local data and intermediate artifacts may be used, created and discarded during the course of a participant's work if they will not be passed along or used by others outside of the context. Limiting the participants' access to customization methods ensures that global constraints are adhered to, but at the cost of reducing the flexibility of the system. Integrating constraint management into the workflow infrastructure helps to maintain consistent processes; however, the enforcement of a constraint should only occur at the level of abstraction at which it is relevant. The impact of changing, reusing, integrating, or embedding a local process into a larger one is minimized when a workflow can be viewed as both local and global depending on the participant. Escalation of workflow processes to encompass global concerns, or delegating them to local work contexts, should be supported through resolution and abstraction mechanisms in the workflow system.

3.6.4. Routing and Naming

Large organizations may divide workflow processes strategically across groups, participants, sites, systems and desktops to take full advantage of the capabilities dispersed throughout the company or even across companies. The problem then is twofold: how does a technical or non-technical workflow participant identify an activity, artifact and resource, and how can they correctly find and hand off information artifacts across network protocols? The WWW offers an excellent example of global and inter-networked object routing and namespace management (Berners-Lee 1994, Whitehead 1997) through its use of universal resource services including Identifiers (URI), Locators (URL), Names (URN), Citations (URC), and Agents (URA (Daigle et al. 1995)). Leveraging the Web's routing and naming mechanisms for symbolic representations, bindings to their object in a specific context, and the explicit set of reference instructions provides a comprehensive infrastructure for network-based information dissemination. Using the routing and naming mechanisms of the WWW allows participants to leverage their familiarity with protocols commonly in use across most desktops. The WWW provides the flexibility to allocate objects across a department or enterprise-wide network to best fit a workgroup's needs and intergroup coordination strategies. Adhering to the routing and naming conventions of the Web also has the added benefit that other technologies such as searching, sorting, indexing and validation engines, sometimes called spidering, can be used to maintain the distributed constraints. URLs allow a non-technical user to specify where a particular query should be routed and which object is desired. A URN represents an institutional commitment to a persistent resource, and by releasing an artifact as a named reference to a resource, distributed groups can access the latest object version with the social contract of availability. This allows workflow participants to incorporate remote services more easily into their local workflows in order to build up their automated infrastructure. URCs are usually name-value pairs that represent some type of meta-knowledge about the activity, artifact, or resource. This allows workflow process rationale and assumptions to be embedded in the resources themselves. Because the routing and naming mechanisms of the Web have become so commonplace and the potential for collaboration across these protocols so great, a workflow infrastructure would increase its applicability to a wider range of workflow problems through support of its routing and naming conventions.
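A small sketch makes the point that standard Web naming already carries the routing information a workflow needs; the URL below is hypothetical, and the standard java.net.URL parser simply exposes the protocol, host and object name.

```java
import java.net.MalformedURLException;
import java.net.URL;

// Sketch of leveraging WWW naming to route a request to a workflow object.
public class ResourceRouting {
    public static void main(String[] args) throws MalformedURLException {
        URL resource = new URL("http://workflow.example.com/projects/release-1/activities/review");

        System.out.println("Route via protocol: " + resource.getProtocol()); // http
        System.out.println("Deliver to host:    " + resource.getHost());     // workflow.example.com
        System.out.println("Object name:        " + resource.getPath());     // /projects/release-1/activities/review
    }
}
```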

3.6.5. Built-in Networking

Built-in networking allows flexible and timely updating of information between remote workflow participants. The meshing of the workflow process infrastructure, the user's process execution workspace, and various networking protocols allows for individual customization and automation of their local tasks while providing dynamic and invisible mechanisms for communication, forwarding and handoff of messages and artifacts. Providing network connectivity in the workflow infrastructure, and even the tasks themselves, encourages participants to explore alternatives and build upon others' work through shared resources without the overhead of explicitly sending out or posting update announcements (Abowd 1996). For example, a workflow system might use Internet email to send data files between applications that automatically read email, extract data, perform tasks and generate new email as a response. Traditionally the ‘workflow’ part does not include the transport of messages, which is done by Internet email, nor the processing of them, which is done by the individual applications. By merging the functionality to provide built-in networking and permitting the specification of the delivery transport and schedule, workflow processes can be made to cover a wider domain than simply whether an item is available to be handed off or to whom it is to be delivered.
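The email-driven example above could be approximated roughly as follows, assuming the JavaMail API is available; the host, credentials and message handling are placeholders, and only the "read email and perform a task" half of the loop is sketched.

```java
import java.util.Properties;
import javax.mail.Folder;
import javax.mail.Message;
import javax.mail.Session;
import javax.mail.Store;

public class MailDrivenTask {
    public static void main(String[] args) throws Exception {
        Session session = Session.getDefaultInstance(new Properties());

        // Poll an inbox over POP3; host and credentials are hypothetical.
        Store store = session.getStore("pop3");
        store.connect("mail.example.com", "workflow-agent", "secret");
        Folder inbox = store.getFolder("INBOX");
        inbox.open(Folder.READ_ONLY);

        for (Message message : inbox.getMessages()) {
            // Extract the data file or instructions, perform the task,
            // and (not shown) generate a reply message as the response.
            System.out.println("Processing work item: " + message.getSubject());
        }

        inbox.close(false);
        store.close();
    }
}
```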

3.6.6. Low Bandwidth and Disconnected

A major portion of the work that takes place in a workflow process happens offline. Support for low-bandwidth and disconnected clients helps in lowering the cost of data maintenance and integrity as well as a process's adoption into a complex and dispersed work setting (Perry 1997, Skopp 1995). Workflow process tools should allow for both high and low bandwidth collaboration and messaging. High bandwidth tools typically involve video conferencing or real-time remote data visualization and require unhampered and continuous network connections. While these tools are useful in some contexts, the overall cooperation in a workflow process usually is not dependent upon these sophisticated utilities and support. Network traffic limitations often make critical path dependence on these technologies in a workflow process impractical. A workflow infrastructure that is not only designed to get around network connection limitations, but embraces a low-bandwidth and disconnected model of work, will be able to map more coherently to the actual tasks because of its flexibility. Participants should have the capability to detach the process fragments that are necessary for the particular tasks without unnecessarily disrupting the work of others, in addition to skillfully re-integrating them into the overall work process when connection is resumed. This synchronization should be supported over standard network access techniques such as modems, infrared ports, cable, satellite, or normal plug-in network connections over common protocols. Assignment, synchronization and completion policies for handing off processes, data and documents as well as other artifacts and resources should be flexible to allow alternative models of structural and supervisorial control and coordination. Aspects of a process including confidentiality of data, tracking and status of remote process fragments, timeout, fallback, handoff and process re-synthesis should be customizable per work context. This allows the individual participants the flexibility and supporting information and infrastructure they need while addressing the global concerns of the process and resource management.

3.7. Dynamic Change

Dynamic change in a workflow system is characterized by continuous change, activity, or progress. Support for dynamic change in a workflow infrastructure may enhance the speed and efficiency of execution in the target context, the convenience of data and artifact tracking and handoff, or the number of choices a participant may make to best accomplish tasks, including value-added changes made at the local level (Abbott and Sarin 1994, Cugola et al. 1996). Also, dynamic change allows re-assignment of execution on-the-fly, which provides for greater resource utilization and replacement. The ability to dynamically change a process's definition, the set of views into the process, and the execution model or behaviors at the time it is in progress allows the workflow to better adapt to changing requirements. In a complex, multi-person, ongoing workflow, not all task requirements or work interruptions can be completely anticipated. Changes to the workflow or changes in the execution context may cause an exception. It is important for a workflow infrastructure to recover gracefully from such occurrences. This may involve dynamic substitution of alternative execution behaviors, transference of objects to the appropriate context, or manipulation of the workflow model either by a participant or by the activity and exception handlers themselves. By allowing the workflow model to query and dynamically change the activities, artifacts and resources in the face of unexpected conditions, the infrastructure has the ability to organize, optimize and more closely adapt to the needs and resolve the problems of the organization.

3.7.1. Contingencies and Handoff

Even the best planned workflow will likely encounter exceptions requiring contingencies during execution. Contingencies usually involve planning for the possibility that something may go wrong, in addition to the expectation that work will proceed correctly and in synchronization with the workflow specification. In a complex workflow it is impossible to plan for all contingencies, so a good recovery mechanism should be in place that permits a disrupted workflow execution to resume, start from a midpoint, or roll back to a previous point in the process using computer or human consummation of reset, restart, undo, complete, abort and recover (Finkelstein et al. 1993, Saastamoinen et al. 1994). Handoff and recovery of artifacts, re-assignment of activities, and delivery of both should be supported at the local level if permissible. Flexible exception handling which includes scoping is key to minimizing the detrimental effects of such unplanned occurrences. Local exceptions should not halt all of the work but rather be propagated through a series of handlers until the problem can be resolved or a workaround, such as an alternative path to a solution, can be specified. Conversely, a show-stopping exception should be broadcast to the appropriate participants and agents to minimize misdirected work or reflect the newest work priorities in the face of the new constraints.
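One way to picture scoped exception propagation is a chain of handlers tried from the most local scope outward, with broadcast reserved for exceptions nothing closer can resolve; the classes below are a hypothetical sketch of that idea, not part of any surveyed system.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of scoped exception handling in a workflow.
class WorkflowException extends Exception {
    WorkflowException(String message) { super(message); }
}

interface ExceptionHandler {
    // Returns true if the handler resolved the exception (e.g. found a workaround).
    boolean handle(WorkflowException e);
}

class HandlerChain {
    // Handlers are added from local scope (activity) out to global scope (process).
    private final List<ExceptionHandler> handlers = new ArrayList<>();

    void add(ExceptionHandler handler) { handlers.add(handler); }

    void raise(WorkflowException e) {
        for (ExceptionHandler handler : handlers) {
            if (handler.handle(e)) {
                return;                       // resolved locally; unrelated work continues
            }
        }
        // Show-stopper: no handler could resolve it, so broadcast to participants and agents.
        System.out.println("Broadcasting unresolved exception: " + e.getMessage());
    }
}
```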

To facilitate workarounds, handoff of process objects and workflow specifications across networks to dispersed participants should be easily accomplished, including context handoff. Clear communication channels, unambiguous work and activity model representations, and common understanding of what is being handed off, and which expectations are being placed upon the recipient, are all requirements for successfully transferring work assignments. Handoff of these process objects usually happens during regular workflow execution, but when an exception occurs, roles and assignments can become indeterminate. Mismatch of expectations may occur, and important tasks run the risk of getting lost in the reshuffling of work assignments. Sharing of process objects across sites may not be enough to guarantee the successful handoff and resumption of a process, due to differences in local work contexts. Handshaking may need to be done during handoff. Online, the confirmation of items received can be accomplished using an electronic docket or invoice to ensure the existence of items and a digital signature to ensure the quality and accuracy. Offline, a handshake is still a handshake between people, but some online record should be kept to reflect the acceptance, agreement and social obligations between all participants involved. Because handoffs may require changing execution context, it is important for workflow processes and objects to be able to adapt to the new environment or handle any contingencies gracefully.

3.7.2. Evolution and Optimization

Evolution of process objects and workflows occurs over time as a result of changing tasks, priorities, responsibilities and even people. Optimization occurs when an improvement to a previous work model results in a better way of doing things, either by adding, removing, or redefining process activities and their constraints (Abowd and Schilit 1997, Jaccheri and Conradi 1993). As a workflow process is repeatedly executed, small changes are made each time through. Eventually a process will converge on a common practice and become institutionalized. Unfortunately, a common practice and a best practice may not be the same thing. In order to determine this, metrics must be kept to evaluate one execution against another. This evaluation is subjective because the criteria change the same way the workflow does. Successful workflows, like successful software, will be applied in new situations that may have been unanticipated by the original creator, and unsuccessful workflows will be abandoned or changed. It is important in the workflow infrastructure to allow the change to occur both before and after deployment of the workflow, by both technical and non-technical participants. Some optimizations may even be performed by the system itself through agents, validation tools, or optimizers.

To better support process evolution and optimization, process fragments, which include small sets of specifications and objects, should be easily reusable, divisible, understandable and capable of being evaluated against some measurable criteria with respect to expected execution or anticipated behavior. Expectation agents and event-monitoring infrastructure are useful for gauging the effectiveness of a workflow process. Also, sometimes when a new process is introduced to a work environment and culture, there is some pushback to its adoption. It is important for the process to be able to adapt to the environment. In addition, process discovery tools are useful for comparing and validating the workflow model with the actual work being accomplished. Wide divergences can indicate technology mismatches, inapplicability, or the need for evolution and optimization of the process. Reuse of process fragments is key to evolutionary development and optimization of workflow processes. A fragment that is successful in one context has a greater likelihood of being successful when tried in another. For instance, in a testing process the participant may require that the same outcome is reached over repeated executions. Different outcomes imply different qualities of the software. Similarly, a process can be used to develop the skill of a particular participant, as in a training domain where the end-user's path through the process is dependent upon their skill. The inculcation results in the goal of visiting every activity or series of activities through repeated executions and increased skills. The process may change based on feedback, usage, level of interaction and number of times executed. As with all changing components, change management techniques such as version control and transactions should be integrated with the system.

3.7.3. Dynamic Behaviors

The ability to dynamically change a process definition, the set of views into the process, or the execution model at the time it is in progress, to better fit changing requirements, availability of resources and the applicability to the current work context, is crucial for workflow process execution and evolution over time (Heimann et al. 1996). While introduction of dynamic change to a process and its representation may require additional infrastructure to recognize and enforce consistency, limit access to change mechanisms as appropriate, or correct problems that impact other parts of the process, the ability to dynamically evolve in conjunction with the work environment, culture and context is important to keeping the online description of work consistent with the actual work being performed. Long-running distributed processes involving multiple stakeholders will encounter a multitude of situations demanding change, escalation and reorganization. Dynamic change allows these issues to be addressed with minimal impact to the ongoing, executing workflow process. Late binding of resources in a workflow allows completion of activities using the resources at hand at the specific point in time the work is actually done. Planning and scheduling components should complement late binding to ensure that the required amount of resources is available at the appropriate times to complete the tasks at hand.

Handoff of control and data from one person or agent to the next, regardless of whether it was planned or unplanned, will result in context switching. To aid in the resolution of context awareness and adaptation, it may be necessary to dynamically change object fields, methods and behaviors at runtime. In the best case, parameterization can accomplish this if the changes, handoff and transcomputation of the workflow are predictable. Most likely, however, the process activities, artifacts and resources will require some acclimation to the target environment. This may involve downloading of new and existing coordinated views of the work or data models, multiple process interpreters for automation or coordination, and bi-directional exchange of dynamic and changing process object updates by way of work refinement and task clarification. A good mechanism for supporting dynamism in process object management is to separate object data from its behaviors. Object behaviors such as messages can be dynamically loaded and resolved, similar to how a model-view-controller implementation style of graphical user interfaces allows dynamic addition and removal of presentation views.
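A minimal sketch of separating object data from dynamically loaded behavior is shown below; the Behavior interface, the class-name string and the Class.forName-based loading are illustrative assumptions about one way such late resolution could work.

```java
// Sketch: process object data is kept separate from its behavior, and the
// behavior implementation is resolved and loaded at runtime.
interface Behavior {
    void perform(ProcessObjectData data);
}

class ProcessObjectData {
    String name;
    String state;
    ProcessObjectData(String name, String state) { this.name = name; this.state = state; }
}

class DynamicProcessObject {
    private final ProcessObjectData data;
    private Behavior behavior;                 // can be swapped while the process is running

    DynamicProcessObject(ProcessObjectData data) { this.data = data; }

    // Resolve a behavior by class name, e.g. when entering a new execution context.
    void loadBehavior(String className) throws Exception {
        behavior = (Behavior) Class.forName(className)
                                   .getDeclaredConstructor()
                                   .newInstance();
    }

    void act() {
        if (behavior != null) {
            behavior.perform(data);
        }
    }
}
```

The separation mirrors the model-view-controller analogy in the text: the data plays the role of the model, while behaviors can be added and removed like presentation views.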

3.7.4. Reflexive

A workflow process is reflexive if during execution it has the ability to remodel itself, either automatically by an agent or by enlisting the aid of a human stakeholder, in response to status and feedback information about how the process is perceived to be progressing. Reflexivity is particularly useful for generative processes where, during the execution of the process, either itself or another process can be built up for delivery and execution. A workflow process or infrastructure component should have the ability at any time to construct, query and manipulate its own process fragments based on the running state and the target environment. Context awareness, i.e. how the component relates to itself, the outside environment and other components, may allow self-integration or adaptation to new situations, self-optimization of scheduling and constraints, and self-organization of relationships and dependencies (Abowd et al. 1997, Bandinelli et al. 1992, Dourish 1995). This level of introspection can be combined with learning rules or business strategies to form an evolution path over time. A reflexive workflow can experiment with resources at its disposal over multiple process iterations to determine the best way to accomplish individual or small group tasks or make explicit inter-group dependencies. Workflow fragments and objects should be symmetric. This means that a workflow process should be able to manipulate its own objects and relationships. In addition, the objects themselves should be able to manipulate, query and dynamically construct existing or new workflows, including dynamic and on-demand composition, augmentation, annotation, embedding and actuation. Supporting reflexivity in a workflow infrastructure allows knowledge about a workflow's applicability to the context and the effectiveness of its deployment to be evaluated over time. Keeping track of a change history, in addition to being able to derive change trees and programmatically form queries about them, provides a mechanism for re-visiting evolutionary branches. Catalysts for change, whether for better or for worse, are easier to isolate and recognize. Continuous evaluation and adaptation through reflexive understanding of the workflow lends itself well to continuous, automated process optimization through self-modification and knowledge building activities such as self-organization and decentralizing organizational knowledge throughout the participants and agents.


3.7.5. Partial Execution

Partial execution means that the execution of a workflow process is dynamically composable, in addition to its components and specification. The effect of this is that the whole process does not have to be specified before it starts executing. This is also known as ‘just in time’ execution, where the definition of the workflow is not created until it is needed. This is helpful for projects where the downstream details are intentionally left undefined because they are dependent upon upstream results. While just-in-time techniques for process execution may create ambiguity and lack the ability to do global optimizations across all activities before execution, work specifications are generated on-the-fly and on-demand, allowing many local optimizations for discriminants other than execution time. Partial execution is analogous to a cat's cradle, a child's game in which an intricately looped string is transferred from the hands of one player to the next, resulting in a succession of different loop patterns. A process's execution should be able to be picked up and executed from any point specified, including the dynamic reorganization of local relationships and constraints to fit the new work context. Partial execution supports multiple iterations of a process fragment or multiple alternative iterations of the same process fragment, changing the order, priority, focus, or completion criteria of the fragment. Support for resolving and integrating processes via pipe-and-filtering, restringing, rework, and amalgamation of diverse processes, which include possibly competing priorities, should be included in the workflow execution infrastructure. This may include temporal and spatial context awareness in addition to the possible execution space and control, coordination and collaboration policies. For individuals or groups, this may include several partial executions in order to repeat until it is right, i.e. the artifact is good enough to post or hand off to the next participant.

4. ADVANCED WORKFLOW ISSUES

The principles of an advanced workflow system as enumerated in the previous section are not applicable across all domains, nor are they mutually consistent. The question is, how are we able to build a workflow design, execution and deployment infrastructure that adheres to this collection of non-independent and sometimes contradictory principles? The answer lies somewhere in a system's flexibility, such that it can evolve over time to address new issues and be applied to new areas. For the most part, there are very few qualitative measurements of what is a good workflow process or how to compare one to another in order to choose the better of the two. There is even less consensus across disciplines on what the supporting environment for cooperative and collaborative work should look like. The convergence of these disciplines has exposed areas where there is no clearly defined best approach. This section looks at some of the tradeoffs, charts the prevailing trends and explores some of the design impacts.

4.1. Tradeoffs

There are different classes of users, disciplines and problems. Determining what would be the idealized environment which would be all things to everyone is a difficult, if not an impossible, task. How people work varies so much that to accommodate such diversity, tradeoffs need to be addressed and prioritized. The difficulty of defining such an environment does not make that goal unachievable, and some principles can be borrowed from the disciplines studying automated and intelligent agents (Hanks et al. 1993, Hyun et al. 1995). Various approaches have been taken to identify broadly applicable constructs and services that can be used as a basis for specialized problem domains or applications: through factoring workflow modeling languages (Osterweil and Sutton 1996), supporting componentization and contextualization in a mobile workflow execution infrastructure (Abowd et al. 1998, Bolcer and Taylor 1996, Griff and Nutt 1997, McCrickard 1997), or controlling change and change mechanisms for data in a database through transaction management (Heineman and Kaiser 1996, Yang and Kaiser 1998). All these approaches share a high degree of flexibility with respect to tradeoffs among various uses of the technologies. Solutions require consideration of the following issues (Swenson and Irwin 1995), for which the approaches are far from settled:

• Proactive versus reactive. Systems that effect execution internally in anticipation or expectation of external changes, either by humans or other systems, are proactive. Proactive systems perform checks and evaluations, often continuously executing, and assume that determination of state is an inexpensive operation. Proactive systems usually have higher assurances that the process's state is internally consistent and its model more accurate with respect to the world outside the system. Reactive systems generally only perform execution when explicitly requested.

• Efficiency versus effectiveness. Efficiency is the relation of output to input; effectiveness is the total output. In information theory, a process is both elegant and efficient if no smaller or less costly process can produce the same output in the same amount of time. While in some cases it may be theoretically impossible to determine this optimal process, high resource utilization and broad artifact sharing contribute to its pursuit. An effective process may sometimes maintain resources not directly related to outputs and is only concerned with increasing the final quantity or quality of results, possibly independent of inputs, usage, or sharing concerns.

• Implicit versus explicit. Concurrency of tasks, distribution of objects, and visibility of workflows can all be made implicit or explicit depending on the level of abstraction. Explicit execution of a workflow requires knowledge about the structure and the participant's awareness of where in the process the activity is taking place. Implicit execution implies a focus on the task at hand independent of the progression of the overall process. Explicit representations allow understanding of the details and support increased precision and unambiguous control. Implicit structures hide details but encourage modularity and are generally less sensitive to changes. Implicit process execution requires more attention as assumptions are made about the context in which it exists.

• Internalized versus externalized. Using a process to guide a participant through a series of complex steps in order to train them how to accomplish a task is an example of internalization. Symmetrically, writing down or modeling job knowledge such that it can be used by another is externalization. In addition to humans, processes can be internalized or externalized in software systems also. Internalized processes assume that the world is its own best model and are characterized as a passive process where some knowledge is acquired through unconscious (in humans) or automatic (in agents) mechanisms. Externalized processes maintain a separate model in the real world, and measures are taken to ensure its applicability and consistency.

• Declarative versus procedural. A declarative statement is an assertion about what is or should be true or valid; a procedural statement is a description of how to achieve it. Ideally, if the control information for a workflow was general, flexible and adaptable enough, the workflow could be specified completely with declarative, goal-based imperatives. Because human-executed activities are highly dynamic and require flexible exception handling (Saastamoinen et al. 1994), it is difficult and sometimes impossible to completely specify a process in this way. Procedural details for any reasonably complex workflow process may be required for efficiency, policy, quality, or accountability concerns. Also, in terms of representability, some workflows are more easily described in procedural rather than logical statements. While the two are not irreconcilable (Craig 1997), they represent different paradigms.

• Guidance versus enforcement. Lack of adequate guiding principles to perform a task will result in wasted effort or misdirected goals. When a workflow is meant to inform or support, participants seeking guidance willingly give up control to the system by allowing their freedom of choices to be limited to the prescripted sequence. Systems that dictate or automate, however, tend to clash with expert or knowledgeable participants, leading to the failure of a workflow's adoption in that context. Enforcement limits the actions that a participant may choose in order to maintain strict coherence for reasons of quality, accountability, or liability. Over-enforcement of workflow steps can sometimes impede work, but will aid in maintaining standards.

• Exploration versus instruction. Instruction is a series of activities designed to impart knowledge. This knowledge is often dependent upon information previously learned, incrementally building up to more sophisticated understanding of complex concepts. Instruction usually has set learning goals which have dependencies creating a full or partial ordering on their presentation. Training is a form of instruction where knowledge is built up through multiple iterations over time. Exploration assumes that there are many paths as well as many goals to building up knowledge. Exploration does not try to constrain the ordering of or the relationships between information. It assumes that if an exploration path is beyond understanding, a participant will backtrack to more solid ground. Also, an exploration approach assumes that the goal may be unknown and the level of understanding evolves over time.

• Precision of model versus understandability. A planning paradox exists (Suchman 1987) in that the more precise a workflow model is, the less likely it will be followed. Models generally abstract away detail to focus on specific and relevant concerns, but sometimes lose the information that was important. Many graphical formalisms are able to convey an intuitive impression of a work model, but these formalisms often lack the semantics to support precise, reliable reasoning about them (Osterweil 1997). This graphical pen-and-napkin approach of diagramming boxes and arrows to convey a workflow is adequate as long as access to clarification is immediate, but the participant will usually find himself pondering ambiguous markings once back in the office. Formalisms can be sorted by their ‘maturity’ (Balzer 1994) to choose a formalism adequate for the level of precision required, or alternative formalisms and their coordinated representations may aid in the understanding for participants with different skill sets.

• Structure versus consistency. The structure of workflow representations and data can determine the maintenance costs of keeping the workflow process and data usable, relevant and consistent. Highly structured representations encourage automation, making it easier to maintain consistency. Tolerating inconsistency increases flexibility in the evolution of the system because multiple competing or irreconcilable representations can be accommodated. This duality is most apparent in computer-executed versus human-executed activities. Tightly constraining human activities may reduce the ability to accomplish tasks in a timely and appropriate manner. Loosening computer-automated activities may lead to ambiguity, causing exceptions. The cost of data maintenance and correction should be weighed in the face of the requirements and characteristics of the problem being solved by the workflow technology.

• Testing versus evolution. Concerning workflow deployment, testing of a process before putting it into use improves the quality of the process, similar to how software testing is performed. The similarity, however, breaks down because, unlike software, workflow processes need to be contextualized: it is very seldom that they can be used off the shelf. Testing requires foreknowledge of the desired workflow, possibly using discovery tools or methods, and often embodies a desire to standardize human-executed activities to adhere to the tested process. Evolutionary processes are put into place with the understanding that work changes over time. Evolution allows in-place testing and changes to occur, sometimes even automatically. Testing allows better control over change mechanisms and pruning unproductive evolution branches early, before spending resources. Evolution allows human or automated optimization of workflows, stemming from greater freedom to change processes to become more applicable to day-to-day work.

4.2. Trends

One trend that is sure to continue is the utilization of the WWW to perform communication, collaboration and coordination activities (Miller et al. 1997). The Web has been successful in helping accomplish tasks in many domains, bringing non-technical users to the network, supporting an enterprise-scale computing infrastructure, and advancing cross-platform issues (with the Java VM (Gosling et al. 1996, Gosling and McGilton 1995), legacy access (Hamilton and Cattell 1996), and component technologies (Hamilton 1997)). The impact of executing workflow processes allowing exchange of ideas and encouragement with colleagues located thousands of miles apart using the Web is a precursor for non-traditional group projects and organization. In order for these ‘virtual workgroups’ to enter into formal commitments to the completion of a project, even with the absence of interrelationships between participants, several issues remain to be addressed (Fielding et al. 1996b, 1998). Distributed authoring, distributed annotation, linking artifacts and processes (people and tasks), visibility and management of artifacts over time, user involvement, group voting and selection issues, and distributed coordination and project management are all areas where the current Web infrastructures of remote Java execution (Sun Microsystems 1997a, b), HTTP extensions and optimizations (Fielding et al. 1997), additional protocol specifications and their supporting tools and environments (Kaiser and Whitehead 1997, Slein et al. 1997, Whitehead 1997), scripting and markup (Berners-Lee and Connolly 1995), and net-based notification services fall short. Some successes, however, such as the Apache, WebSoft or (potentially) Openscape (see further references) projects, validate the incremental benefits and open, low-cost adoption of this approach, even without this full set of services such as versioning and transactions (Alonso 1997, Kaiser 1996). Augmenting work on the Web with ‘diplomacy’ between sites or ‘socially driven protocols’ (Winograd 1994) between people and agents to determine agreed-upon formats and activities, such as in a summit/treaty model (Ben-Shaul and Ifergan 1997), allows negotiation to determine best-fit interaction policies without a top-down enforcement model.

The virtual workgroup approach is not too far from being reality. While organizational processes (Humphrey 1989) and production workflows have dominated the commercial field in the past decade, the counter-trend of adopting groupware and ad hoc approaches has broken ground in new directions based on converging technologies and garnered interest at large (Abbott and Sarin 1994, Dyson 1990, 1992, Malone and Crowston 1994). Research and concern are being applied to more dynamic and semi-formal work models. With more domains migrating to computers and better techniques for tracking offline activities and artifacts, online work models are becoming more supportive of the unstructured nature of human-to-human and human–computer interactions. Convergence between human- and computer-performed activities and blending of the boundaries between these will continue. Researchers are already extending the continuum. The work to date applying more formal models, loosely based on computer execution, to describe interactions has come up against the boundaries of human factors. Already the opposite approach of applying human interaction models to agent and system interactions, to better coordinate between human-executed and computer-executed activities, is an apparent trend (Heimbigner and Wolf 1997, Nuseibeh 1997). The approaches of managing inconsistency, informality and intelligence (in the form of automated agents) are being pursued.

Convergence of asynchronous workflow with real-time groupware can be seen in joint problem solving systems and in support for rapid reorganization based on communication needs. Research experiments grounded in these approaches have appeared in recent literature (Abowd and Schilit 1997, Bandinelli 1995). Increased semantic integration between conversation tools and workflow environments, instead of blind handoff of control, has appeared as work in data integration (Lee et al. 1996), event-based integration (Object Management Group 1996, Taylor et al. 1996), heterogeneous systems (Krishnakumar and Sheth 1995) and enveloping behaviors (Valetto and Kaiser 1995). Some other technological trends that are sure to continue are:

• Lightweight and componentized. User interface components, workflow execution infrastructure and even tools and their integrations can be swapped to provide incremental scaling of the solution and incremental adoption of the technology. Lightweight components encourage customization and reuse at a fine level of granularity, fostering the simplest solutions and just-in-time software component assembly and problem solving.

• Heterogeneous and open. Support for open hyperweb protocols and multiple desktop environments is a critical success factor for an elaborative process (Anderson et al. 1994). Open systems that provide coherent interfaces to their internal components allow integration with tools in ways the original designers had not envisioned. Multiple programming (Maybee et al. 1996, Taft 1996) and process languages allow alternative means of specification of the work model and its behavior, increasing a system's flexibility.

• Ubiquitous and mobile. Future computer environments (Abowd 1996, Gilder 1995, Weiser 1991, 1993) are intended to allow access to information when the user demands, across time and space. Computing technology innovation is providing solutions that fit the way people work rather than the other way around. As PDAs, pagers, kiosks, information terminals, organizers, set-top, laptop, and desktop information and communication devices become commonplace, more and more information will be delivered through these channels and mechanisms. This underlies the need for increased synchronization and better control and coordination of the information, as provided by workflow and event structures for remote communication, wide-scale notification (Alonso et al. 1997, Barrett et al. 1996, Geppert et al. 1996), and architectures amenable to distribution across a variety of platforms (Perry 1997, Salasin 1997).

• Dynamic and adaptable. In addition to the obstacles of information delivery, transferring information between contexts will require adaptation and the resolution of conflicts over appropriateness. Rapid and automatic reorganization, possibly as the result of self-awareness and context awareness, may be necessary even dynamically during the course of execution. The same infrastructure used to adapt to an execution context may be able to perform discovery and evolution of the process in response to environmental feedback, thus automating adaptation through task reuse and providing context-specific desktops and views (Abowd et al. 1997, Dey et al. 1998, Lehman 1996, Nutt et al. 1999).

• Reflexive and introspective. Workflow adaptation cannot be accomplished without being able to view the state, purpose and context of a component. An object or component is introspective if it has the ability to assess its own health or integrity, and is reflexive if it knows how to relate and change in response to other components in its environment. These properties, when combined with various strategies and policies, can provide the basis for simulation, discovery, self-correction, self-organization and self-optimization. Components which can determine this are able to reason about their applicability and perform rich, semantic interoperations with other components (Barghouti and Estublier 1997); a small illustrative sketch follows this list. Systems such as Bandinelli et al. (1993a) and Bolcer and Taylor (1996) have defined interfaces to support reflexive queries, or support introspection as in Hamilton (1997), but policies guiding evolution are not yet readily derivable.

• Innovative and invisible interfaces. Task-specific interfaces or end-user, domain-oriented presentations (Fischer et al. 1992) are important for supporting multiple stakeholders with varying degrees of knowledge and expertise. Workflow systems should be fundamentally integrated with hypermedia protocols, and the research trend should continue to explore the interaction between these two areas for specifying formal and informal relationships (Nutt et al. 1997). The methods of presentation should hide behind familiar paradigms and abstractions rather than forcing end-users into learning new ones, and researchers should work to make process more invisible (Taylor 1997). That is not to say new innovative interfaces using technologies such as VRML and 3D-based systems (Funkhouser 1995), shared walkthroughs, electronic blackboards, shared whiteboards and joint problem solving should not be considered. Process should not be seen as a goal in itself, but as a means for providing the steps to a solution, supporting the end-user goal of the right information at the right time.
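As a concrete illustration of the reflexive and introspective properties named above, the following is a minimal Java sketch under assumed, hypothetical interfaces; it is not drawn from any of the systems surveyed here:

```java
// Hypothetical sketch of an introspective, reflexive workflow component.
// None of these interfaces come from the systems surveyed in this paper.
public interface IntrospectiveComponent {
    // Introspection: the component can assess and report its own state.
    String describePurpose();
    boolean isHealthy();

    // Reflexion: the component can relate and adapt to peers in its environment.
    boolean isCompatibleWith(IntrospectiveComponent peer);
    void adaptTo(IntrospectiveComponent peer);
}

// A trivial activity component that exhibits both properties.
class ReviewActivity implements IntrospectiveComponent {
    private boolean toolAvailable = true;

    public String describePurpose() { return "Collect review comments on a document"; }
    public boolean isHealthy()      { return toolAvailable; }

    public boolean isCompatibleWith(IntrospectiveComponent peer) {
        // A real system would compare interfaces, data formats or policies here.
        return peer.isHealthy();
    }

    public void adaptTo(IntrospectiveComponent peer) {
        // For example, fall back to a simpler tool integration when a peer is degraded.
        if (!peer.isHealthy()) { toolAvailable = false; }
    }
}
```

Such self-descriptions are what strategies for simulation, discovery and self-optimization would query; the open question noted above is where the policies that drive adaptTo come from.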

4.3. Design Impacts

The WWW is expected to play a major part in the communication, collaboration and coordination of remote participants. While the Web itself is a widely deployed, universal infrastructure, its data-centric approach and the stateless nature of its protocols limit task dependencies, synchronization, scheduling and sharing of work. Systems to date that perform workflow process execution in conjunction with the WWW take a bipolar approach. One approach tightly constrains the actions that can be performed by a process stakeholder (such as a participant, end-user or designer) or software agent, but guarantees the internal data consistency of the workflow model and execution data. The other allows stakeholders to perform arbitrary actions to accomplish their activities, but places minimal constraints on the structure of the intermediate and final data artifacts, providing few guarantees of consistency. This section gauges several approaches exhibited by systems along this spectrum.

4.3.1. The Database Model

Traditional workflow systems are built on top of databases. Activities are modeled as schema which include values for who (or what role) is responsible for the activity, what tools are needed to accomplish it, the relative priority of the activity at hand, and data dependencies, including precedence and document handoff. This approach limits the types of actions that an end-user can perform on each activity, as well as the changes that a workflow designer can make to the activities. Because the interactions are limited to pre-specified tools at pre-specified steps in the workflow, data consistency is guaranteed. Database systems also allow transaction support and rollback of processes to a previous state of a workflow progression. This most often happens when the workflow model becomes out of sync with the real activities taking place. Each of the workflow process steps is data-driven. Specification of the control flow is minimal and often hard-coded into the database schema. Fixing the workflow steps and schema in this way requires the process to be stringently adhered to, thus impacting the dynamic evolution capabilities of the system, and also strongly constraining the work practices of the performing organization. Also, by restricting the tools and actions allowed in order to guarantee data consistency, database systems tend to limit their flexibility. End-users cannot use arbitrary tools to accomplish their tasks. It is difficult to integrate new tools into the process or maintain local process state external to the database. Database systems typically have a high cost of adoption and require a critical mass of users to participate in using the system before many of the workflow benefits are seen. There is no incremental adoption path for easily mixing human-performed activities not stored in the database with those of the process model.
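To make the database model concrete, the following hypothetical Java sketch shows the kind of fixed activity record such systems store; the field names are illustrative only and are not taken from any particular product:

```java
import java.util.List;

// Hypothetical sketch of the kind of activity record a database-centric
// workflow system stores. Once the schema is fixed, end-users can act only
// through the pre-specified tools at the pre-specified steps.
class ActivityRecord {
    String role;                 // who (or what role) is responsible
    List<String> requiredTools;  // tools allowed at this step
    int priority;                // relative priority of the activity
    List<String> predecessors;   // precedence: activities that must finish first
    List<String> documentsIn;    // documents handed off into this step
    List<String> documentsOut;   // documents handed off out of this step

    ActivityRecord(String role, List<String> requiredTools, int priority,
                   List<String> predecessors, List<String> documentsIn,
                   List<String> documentsOut) {
        this.role = role;
        this.requiredTools = requiredTools;
        this.priority = priority;
        this.predecessors = predecessors;
        this.documentsIn = documentsIn;
        this.documentsOut = documentsOut;
    }
}
```

Because the control flow is effectively encoded in these records, changing the process after deployment means changing the schema, which is exactly the evolution bottleneck discussed above.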

4.3.2. Ad Hoc Systems

An opposite approach to the database model is the ad hoc workflow model. This stems out of the CSCW field and is typified by such commercial systems as Netscape Collabra (Netscape Communications 1997) and Microsoft Exchange (Microsoft Corporation 1997). Users are allowed to perform their work in an 'ad hoc' way where solutions and goals are accomplished with whatever tools and data are immediately available. These systems are easily adopted and accessible through the user's desktop or browser. Their primary purpose is to inform users of the current problems, issues and data needed to accomplish their particular tasks. While this approach tends to be appealing in that it mimics more closely the fast, furious and unstructured pace of some projects, it provides minimal automation and guidance support. Workflow models lack a coherent, semantically rich and precise description of the work being performed. Data and control dependencies, as well as assignments, are not immediately viewable, and the measurable status of the project is difficult to ascertain. The ad hoc nature of these systems may cause the data and artifacts being created and shared to be inconsistent. However, because the changing data and rationale are recorded over time and are both viewable and shared by all relevant participants, any data inconsistencies are managed by the participants and not the system. This distinction allows for a broad degree of flexibility in accomplishing work, but increases the difficulty of managing it.

4.3.3. Hybrid Approaches

Networks, not databases, should be the focus of attention for workflow systems evolution. Underlying characteristics of the database and storage mechanisms have largely determined the structure of the workflow or process system. A hybrid approach allows distributed work to take place independently with 'just enough' synchronization to maintain workable data consistency, i.e. enforcing local constraints or ignoring global ones when concerns are more immediate. When data consistency becomes stretched, hybrid systems employ some mechanism for resolving inconsistencies, either through replication, negotiation, or mobility. Tolerating temporary deviations and inconsistency of data in workflow process execution is acceptable and can be managed using various techniques (Cugola et al. 1996, Emmerich et al. 1997, Finkelstein et al. 1993).
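A minimal Java sketch of the hybrid idea, assuming a hypothetical design rather than any of the systems discussed below: local work proceeds immediately, global consistency is checked lazily, and detected deviations are queued for later reconciliation by replication, negotiation, or a human participant:

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.Map;
import java.util.Queue;

// Hypothetical sketch of 'just enough' synchronization: local updates are
// applied at once, and conflicts with the last known global state are only
// recorded for later reconciliation instead of blocking the work.
class HybridWorkspace {
    private final Map<String, String> localState  = new HashMap<>();
    private final Map<String, String> globalState = new HashMap<>();
    private final Queue<String> conflicts = new ArrayDeque<>();

    void updateLocally(String artifact, String value) {
        String global = globalState.get(artifact);
        if (global != null && !global.equals(value)) {
            conflicts.add(artifact);      // tolerate the deviation, but note it
        }
        localState.put(artifact, value);  // local work proceeds regardless
    }

    // Reconciliation could be automatic (e.g. last-writer-wins replication)
    // or could escalate to negotiation between the participants involved.
    void reconcile() {
        while (!conflicts.isEmpty()) {
            String artifact = conflicts.poll();
            globalState.put(artifact, localState.get(artifact));
        }
    }
}
```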

4.3.3.1. Lotus Domino

Lotus Domino (the WWW extension to Lotus Notes (Lotus Development Corporation 1997)) takes a hybrid approach. Domino mixes an underlying messaging and database system with mechanisms that allow stakeholders to view and change data and artifacts through a WWW browser. Data is stored in a Lotus Notes server, and distributed sites use a replication strategy. Domino includes development tools for workflow development and tool integration, and workflows are actually deployed as applications. Domino's support for Java and Java components provides for some post-deployment workflow evolution by allowing the presentation or view of the workflow to be changed. However, the underlying workflow model typically does not evolve once the workflow application has been deployed. Domino provides some WWW interfaces for manipulating and extracting information out of the underlying relational database, but typically this is controlled by the workflow application developer and not the end-user. Domino represents a slightly more open approach than the traditional database workflow systems in that custom views into the workflow can be created via the WWW through a standard set of APIs. These views, however, are usually fixed before deployment to maintain data consistency. The tools integrated into the workflow process, as well as the control flow and data dependencies, rarely (if ever) change once the workflow process begins execution. While this may be necessary to guarantee data consistency, it represents an inhibitor to the usefulness and accuracy of the system.

4.3.3.2. OzWeb

OzWeb (Kaiser et al. 1997) allies itself much more closely with the WWW-based workflow systems. While it too has an underlying database, it provides a much more semantically rich process representation. The OzWeb system is highly componentized and has specific components for dealing with process descriptions, transactions, rules and constraints, tool integration and management, and inter-site coordination. Data or events can trigger rules and constraints to both proactively and reactively drive the workflow process. OzWeb supports evolution in that control flow and data constraint rules can be dynamically added, evaluated and triggered over HTTP. In order to maintain distributed data consistency, OzWeb employs a summit and treaty negotiation model. When two distributed sites encounter a condition where the data is inconsistent, the system initiates a resolution mode to reconcile the inconsistent data, either automatically or with the involvement of human participants. In this way, OzWeb allows the distributed components of a workflow process to proceed with local execution while providing coordination and control support. Because OzWeb processes are, at their center, rules, modification or evolution of a running process often requires a high degree of technical expertise. Those participating in a running process are not always the same stakeholders that have the ability to change the process to fit their work context or habits. While rule-based systems provide excellent support for automation, they create a mismatch between those developing or defining the workflows and those who use or benefit from them. This limits the types of end-user customizations that can be made to an executing workflow system.

4.3.3.3. Endeavors Workflow

The Endeavors (Bolcer and Taylor 1996) philosophy is derived directly from the WWW-based model. In a similar manner to an end-user managing the exception when encountering a 'page not found' error on the WWW, Endeavors places the burden of resolution initially with the end-user. The end-user then has the option to employ automated or manual workarounds to the exception or inconsistency. Endeavors, however, differs from ad hoc workflow systems in that it includes a semantically rich process description language. This allows highly structured process modeling and execution capabilities to be integrated into the workflow while not being so constraining that ad hoc work is inhibited. Endeavors processes allow multiple, customizable views. The default process view allows both non-technical and technical users to visually edit the workflow. Endeavors is also a highly componentized system. In addition to being able to reference and embed various Endeavors components into WWW pages and email, various process fragments, including their data and tools, can be sent and executed using HTTP. Endeavors is flexible in its approach. Depending on how the system is customized, the level of data consistency enforcement allows Endeavors to be used either primarily to inform the users of their data and activities, as in an ad hoc system, or to automate certain activities based on their inputs, as in a standard workflow system. Similarly, Endeavors' policies can be customized to enforce the workflow process execution, by limiting the user's interactions and thus keeping the data consistent, or to act as a support infrastructure that enhances the coordination capabilities between users by loosening these constraints.
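The flavour of this policy spectrum can be suggested with a small hypothetical Java sketch (this is not the actual Endeavors API): an inconsistency is first surfaced to the end-user, who may then invoke either a manual or an automated workaround, while an installation-level policy decides how strictly consistency is enforced:

```java
// Hypothetical sketch of end-user-driven exception handling in the spirit
// described above; names and structure are assumptions, not Endeavors' API.
interface Workaround { void apply(String activity); }

class UserDrivenExceptionHandler {
    private final boolean enforceConsistency;   // policy: strict vs. supportive

    UserDrivenExceptionHandler(boolean enforceConsistency) {
        this.enforceConsistency = enforceConsistency;
    }

    void onInconsistency(String activity, Workaround manual, Workaround automated,
                         boolean userPrefersAutomation) {
        if (enforceConsistency) {
            // Strict customization: constrain the user and repair automatically.
            automated.apply(activity);
        } else if (userPrefersAutomation) {
            automated.apply(activity);           // the user opts into automation
        } else {
            manual.apply(activity);              // the user works around it by hand
        }
    }
}
```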

5. CONCLUSIONS

The ubiquity of the WWW and Java lends itself well to allowing participation by remote people and teams in a workflow. While these technologies alone are insufficient for addressing all the requirements of an advanced workflow system, they provide a grounding point for incrementally adding cooperation and collaboration policies in order to demonstrate the feasibility of some of the approaches mentioned for providing distributed workflow functionality. Web protocols and Java allow the infrastructure to be open and extensible, thus providing a mechanism for integration with current and evolving Internet technologies, including security, data versioning, document modeling and management, broadcast, notification and other distribution or collaboration protocols.

Flexibility will continue to be an emphasis in the development of workflow systems. Networks of servers, clients and workflow processes are dynamic structures that must tolerate discontinuity, change and inconsistency. Workflow participants will likely find it necessary to disconnect from the network for a period of time, to travel with a laptop computer for example, and continue their individual activities even using low-bandwidth or disconnected clients (Skopp 1995). A workflow infrastructure should provide the mechanisms to assign, synchronize and reconcile a workflow participant's state with that of other stakeholders across a wide variety of traditional and non-traditional computing platforms. Changes by the participants doing the work, even while the work is being performed, will be supported and encouraged as the technologies mature to encourage low-cost adoption, extension and adaptation to new problems. Workflow design will emphasize support for virtual workgroups through rapid development, customization and reuse. Non-technical participants will specify and evolve their own workflows through clever graphical cutting-and-pasting of workflow representations and their behaviors, similar to techniques used now in informal editing of directed graphs (Ibrahim 1997). Complex processes will be easily instantiated and deployed from a process handbook (Malone et al. 1995) or purchased in an open marketplace (Whitehead et al. 1995). As advanced workflow system infrastructures begin to embrace the ingenuity, creativity and dynamism of how human workers accomplish tasks, these systems will help groups become more efficient and effective in their daily activities through better communication and coordination.

ACKNOWLEDGEMENTS

The authors would like to acknowledge the members of the UCI EDCS projects, particularly Roy Fielding, Jim Whitehead, Ken Anderson, Peyman Oreizy and Rohit Khare, in addition to the Endeavors project team members Clay Cover, Art Hitomi, Ed Kraemer and Adam Hampson, whose discussions and interactions have provided an extensive, idea-rich environment. Special thanks is reserved for Mark Bergman and Peter Kammer for helping shape the issues, outline and arguments, and, of course, David Rosenblum, David Redmiles and Michael Leon for their review and comments.

REFERENCES

Abbott, K. and S. Sarin. 1994. Experiences with workflow management: issues for the next generation. Proceedings of the Conference on CSCW, Chapel Hill, NC, pp. 113–120.

Abowd, G. 1996. Ubiquitous computing: research themes and open issues from an applications perspective. Technical Report GIT-GVU 96-24, GVU Center, Georgia Institute of Technology, October.

Abowd, G., C. G. Atkeson, J. Hong, S. Long, R. Kooper and M. Pinkerton. 1997. CyberGuide: a mobile context-aware tour guide. ACM Wireless Networks, March.

Abowd, G., A. K. Dey and A. Wood. 1998. Applying dynamic integration as a software infrastructure for context-aware computing. GVU Technical Report GIT-GVU-98-01, February.

Abowd, G. and B. Schilit. 1997. Ubiquitous computing: the impact on future interaction paradigms and HCI research. CHI'97 Workshop, March.

Ackerman, M. and D. McDonald. 1996. Answer Garden 2: merging organizational memory with collaborative help. CSCW 96 Proceedings, ACM.

Ackerman, M. 1994. A field study of Answer Garden. CSCW 94 Proceedings, ACM, pp. 243–252.

Action Technologies. 1998. Enterprise Workflow Tools. http://actiontech.com/.

Alonso, G. 1997. Processes + transactions = distributed applications. Proceedings of the High Performance Transaction Processing (HPTS) Workshop at Asilomar, California, 14–17 September.

Alonso, G., C. Hagen, H.-J. Schek and M. Tresch. 1997. Towards a Platform for Distributed Application Development. NATO Advanced Studies Institute (ASI), Istanbul, Turkey, August.

Ambriola, V., P. Ciancarini and C. Montangero. 1990. Software process enactment in Oikos. Proceedings of the Fourth ACM SIGSOFT Symposium on Software Development Environments, Irvine, CA, pp. 183–192.

Anderson, K., R. Taylor and E. J. Whitehead. 1994. Chimera: hypertext for heterogeneous software environments. European Conference on Hypermedia Technology (ECHT'94), Edinburgh, Scotland, September.

Balzer, R. 1994. Process definition formalism maturity levels. Proceedings of the 9th International Software Process Workshop, pp. 132–133, October.

Bandinelli, S. 1995. Report on the workshop on software process architectures. Milan, 20–23 March.

Bandinelli, S., A. Fuggetta, C. Ghezzi and S. Grigolli. 1992. Process enactment in SPADE. European Workshop on Software Process Technology, Springer Verlag, pp. 66–82.

Bandinelli, S., A. Fuggetta and C. Ghezzi. 1993a. Software process model evolution in the SPADE environment. IEEE Transactions on Software Engineering, 19(12), December.

Bandinelli, S., A. Fuggetta and S. Grigolli. 1993b. Process modeling in-the-large with SLANG. In Proceedings of the Second International Conference on the Software Process, pp. 75–83.

Barghouti, N. and G. Kaiser. 1990. Modeling concurrency in rule-based development environments. IEEE Expert, December.

Barghouti, N. and S. Arbaoui. 1996. Process support and workflow management with Improvise. NSF Workshop on Workflow and Process Automation in Information Systems: State-of-the-Art and Future Directions, May, Athens, GA.

Barghouti, N. and J. Estublier. 1997. Semantic interoperability of process-sensitive systems. AT&T Laboratories and Adele/LSR, France, Research Directions in Process Technology Workshop, Nancy, France, July.

Barghouti, N., E. Koutsofios and E. Cohen. 1995. Improvise: interactive multimedia process visualization environment. Fifth European Software Engineering Conference (ESEC'95), published as Lecture Notes in Computer Science, W. Schafer (Ed.), Springer-Verlag, September.

Barrett, D., L. Clarke, P. Tarr and A. Wise. 1996. An event-based software integration framework. Technical Report 95-048, Computer Science Department, University of Massachusetts at Amherst, revised 31 January.

Ben-Shaul, I. and S. Ifergan. 1997. WebRule: an event-based framework for active collaboration among web servers. Proceedings of The Sixth International World Wide Web Conference, pp. 521–531, Santa Clara, California, USA, April.

Bergman, M. 1996. Criteria for workflow systems. UCI Technical Survey, December.

Berners-Lee, T. 1994. Universal resource identifiers in WWW: a unifying syntax for the expression of names and addresses of objects on the network as used in the World-Wide Web. Internet Engineering Task Force (IETF), draft document, RFC 1630, CERN, June.

Berners-Lee, T. and D. Connolly. 1995. HyperText markup language specification – 2.0. Internet Engineering Task Force, draft document, work in progress, MIT/W3C, June.

Berners-Lee, T., R. Fielding and H. Frystyk. 1996. Hypertext transfer protocol – HTTP/1.0. Internet Informational RFC 1945, MIT/LCS, UC Irvine, May. http://www.ics.uci.edu/pub/ietf/http/rfc1945.txt

Berners-Lee, T., L. Masinter and M. McCahill. 1994. Uniform resource locators (URL). RFC 1738, CERN, Xerox, University of Minnesota, December. ftp://ds.internic.net/rfc/rfc1738.txt

Blair, J. (ed). 1981. Office automation systems: why some work and others fail. Stanford University Conference Proceedings, Stanford University, Center for Information Technology.

Boehm, B. 1988. A spiral model of software development and enhancement. IEEE Computer, 21(2), pp. 61–72.

Bogia, D. 1995. Supporting flexible, extensible task descriptions in and among tasks. PhD Thesis, University of Illinois at Urbana-Champaign, Technical Report #UIUCDCS-R-95-1925.

Bogia, D. and S. Kaplan. 1995. Flexibility and control for dynamic workflows in the wOrlds environment. ACM Conference on Organizational Computing Systems, pp. 148–159, Milpitas, CA.

Bolcer, G. and R. Taylor. 1996. Endeavors: a process system integration infrastructure. In 4th International Software Process Workshop, pp. 76–85, Brighton, U.K., IEEE Computer Society Press, December.

Broy, M. 1985. Control Flow and Data Flow: Concepts of Distributed Programming. Springer Verlag.

Chen, P. 1983. Entity-Relationship Approach to Information Modeling and Analysis. North Holland, Amsterdam.

Christie, A. 1994. A practical guide to the technology and adoption of software process automation. Software Engineering Institute, CMU/SEI-94-TR-007, March.

Christie, A. 1995. Software Process Automation – the Technology and its Adoption. Springer Verlag.

Christie, A., L. Levine, E. Morris et al. 1997. Software process automation: interviews, survey, and workshop results. CMU/SEI Technical Report SEI-97-TR-008, ESC-TR-97-008, October.

Clough, A. 1993. Choosing an appropriate process modeling technology. CASE Outlook, 7(1), pp. 9–27.

Conradi, R., C. Fernstrom and A. Fuggetta. 1992. Towards a reference framework for process concepts. In Software Process Technology, Springer Verlag, pp. 3–17.

Craig, I. 1997. Converting declarative into procedural (and vice-versa). Technical Report, Department of Computer Science, Reflective Architectures in the VIM Project, University of Warwick, Coventry.

Cugola, G., E. Di Nitto, A. Fuggetta and C. Ghezzi. 1996. A framework for formalizing inconsistencies and deviations in human-centered systems. ACM Transactions on Software Engineering and Methodology (TOSEM), 5(3), pp. 191–230.

Daigle, L. et al. 1995. Uniform Resource Agents (URAs). Internet Engineering Task Force (IETF), draft document, November.

Dey, A., G. Abowd and A. Wood. 1998. CyberDesk: a framework for providing self-integrating context-aware services. Proceedings of Intelligent User Interfaces '98 (IUI '98), January.

Dimitrios, G., M. Hornick and A. Sheth. 1995. An overview of workflow management: from process modeling to workflow automation infrastructure. Distributed and Parallel Databases, 3(2), pp. 117–153.

Dourish, P. 1995. Developing a reflective model of collaborative systems. ACM Transactions on Computer-Human Interactions, 2(1), pp. 40–63.

Dyson, E. 1990. Why groupware is gaining ground. Datamation, March, pp. 52–56.

Dyson, E. 1992. Workflow. Technical report, EDventure Holdings, September.

Ehrlich, S. 1987. Strategies for encouraging successful adoption of office communication systems. ACM Transactions on Office Information Systems, pp. 340–357.

Ellis, C. 1979. Information control nets: a mathematical model of office information flow. Proceedings of the 1979 ACM Conference on Simulation, Measurement and Modeling of Computer Systems.

Ellis, C. and G. Nutt. 1996. Workflow: the process spectrum. NSF Workshop on Workflow and Process Automation in Information Systems: State-of-the-Art and Future Directions, Athens, Georgia, pp. 140–145, May.

Emmerich, W. 1991. MERLIN: knowledge-based process modeling. In First European Workshop on Software Process Modeling, Associazione Italiana per l'Informatica ed il Calcolo Automatico, Milan, Italy.

Emmerich, W. et al. 1997. Standards compliant software development. Proceedings of the International Conference on Software Engineering Workshop on Living with Inconsistency, IEEE.

Estublier, J. 1996. APEL: Abstract Process Engine Language. University of Grenoble, France, 2nd Process Technology Workshop, 20–23 February at U.C. Irvine.

Fernstrom, C. 1991. The Eureka Software Factory: Concepts and Accomplishments. 3rd European Software Engineering Conference, Milan, Italy, October.

Fernstrom, C. 1993. Process Weaver: adding process support to UNIX. Proceedings of the Second International Conference on the Software Process.

Fernstrom, C. and L. Ohlsson. 1991. Integration needs in process centered environments. First International Conference on the Software Process, IEEE Press, Redondo Beach, CA, October.

Fielding, R., J. Gettys, J. Mogul et al. 1996a. Hypertext Transfer Protocol – HTTP/1.1. Internet Engineering Task Force (IETF), draft document, 23 April.

Fielding, R., E. J. Whitehead, K. Anderson, G. Bolcer, P. Oreizy and R. Taylor. 1996b. Software engineering and the cobbler's barefoot children, revisited. Technical Report, EDCS Software Group, University of California, Irvine, November.

Fielding, R., E. J. Whitehead, K. Anderson, G. Bolcer, P. Oreizy and R. Taylor. 1998. Web-based development of complex information products. Communications of the ACM, 41(8), August.

Fielding, R., J. Gettys, J. Mogul, H. Nielsen and T. Berners-Lee. 1997. Hypertext transfer protocol – HTTP/1.1. RFC 2068, UC Irvine, DEC, MIT/LCS, January. http://www.ics.uci.edu/pub/ietf/http/rfc2068.txt

Finkelstein, A., D. Gabbay, A. Hunter, J. Kramer and B. Nuseibeh. 1993. Inconsistency handling in multi-perspective specifications. Fourth European Software Engineering Conference (ESEC'93), Garmisch, Germany, September.

Fischer, G., A. Girgensohn, K. Nakakoji and D. Redmiles. 1992. Supporting software designers with integrated domain-oriented design environments. IEEE Transactions on Software Engineering, 18(6), pp. 511–522.

Fitzpatrick, G., S. Kaplan and T. Mansfield. 1996. Physical spaces, virtual places and social worlds: a study of work in the virtual. CSCW'96, pp. 334–343, Boston, MA.

Fitzpatrick, G., W. Tolone and S. Kaplan. 1995. Work, locales and distributed social wOrlds. In Proceedings of the 1995 European Conference on Computer Supported Cooperative Work (ECSCW '95), pp. 1–16, Stockholm, September.

Flores, F. 1979. Management and communication in the office of the future. PhD Thesis, Department of Philosophy, University of California at Berkeley.

Fosdick, L. and L. Osterweil. 1976. Data flow analysis in software reliability. Computing Surveys, 8(3), pp. 305–330.

Fuggetta, A. 1996. Architectural and linguistic issues in supporting cooperation and interaction. Politecnico di Milano, 2nd Process Technology Workshop, 20–23 February at U.C. Irvine.

Fuggetta, A. and C. Ghezzi. 1994. State of the art and open issues in process-centered software engineering environments. Journal of Systems and Software, 26, pp. 53–60.

Funkhouser, T. 1995. RING: a client-server system for multi-user virtual environments. Symposium on Interactive 3D Graphics, pp. 85–92.

Geppert, A., M. Kradolfer and D. Tombros. 1996. EvE, an event engine for workflow enactment in distributed environments. Technical Report 96.05, Department of Computer Science, University of Zurich, May.

Gilder, G. 1995. The coming software shift. Telecosm, Forbes ASAP Magazine, August.

Gosling, J., B. Joy and G. Steele. 1996. The Java Language Specification. Addison-Wesley, August. http://java.sun.com/docs/books/jls/html/index.html

Gosling, J. and H. McGilton. 1995. The Java(tm) Language Environment: A White Paper. Sun Microsystems Inc., October.

Griff, A. and G. Nutt. 1997. Tailorable location policies for distributed object services. Technical report, December.

Grudin, J. 1988. Why CSCW applications fail: problems in design and evaluation of organizational interfaces. CSCW 88 Proceedings, ACM, pp. 85–93.

Grudin, J. and L. Palen. 1995. Why groupware succeeds: discretion or mandate? In Proceedings of the 4th European Conference on Computer-Supported Cooperative Work, pp. 263–278, Kluwer Academic Publishers.

Gruhn, V. 1994. Business process modeling in the workflow management environment LEU. Proceedings of the Entity-Relationship Conference, Manchester, UK, December.

Gruhn, V. and R. Jegelka. 1992. An evaluation of FUNSOFT nets. In Software Process Technology, pp. 3–17, Springer Verlag.

Gruhn, V. and S. Wolf. 1995. Software process improvement by business process orientation. Software Process – Improvement and Practice, Pilot Issue, pp. 49–56.

Hamilton, G. (ed). 1997. JavaBeans, revision 1.01. Sun Microsystems, July. http://java.sun.com/beans/spec.html

Hamilton, G. and R. Cattell. 1996. JDBC(TM): a Java SQL API. JavaSoft White Paper, 4 April.

Hammer, M. and J. Champy. 1993. Reengineering the Corporation: A Manifesto for Business Revolution. Harper, New York.

Hanks, S., M. Pollack and P. Cohen. 1993. Benchmarks, test beds, controlled experimentation and the design of agent architectures. AI Magazine, Winter.

Harel, D. 1987. Statecharts: a visual formalism for complex systems. Science of Computer Programming, 8, pp. 231–274.

Heimann, P., G. Joeris, C. Krapp and B. Westfechtel. 1996. DYNAMITE: dynamic task nets for software process management. Proceedings of the 18th International Conference on Software Engineering, pp. 331–341, March.

Heimbigner, D. 1989. P4: a logic language for process programming. Proceedings of the 5th International Software Process Workshop, pp. 67–70, Kennebunkport, Maine.

Heimbigner, D. 1992. The ProcessWall: a process state server approach to process programming. In Proceedings of the Fifth ACM SIGSOFT/SIGPLAN Symposium on Software Development Environments, Washington, D.C., 9–11 December.

Heimbigner, D. 1996. The ProcessWall: a process state server approach to process programming (revised). University of Colorado, Boulder, 2nd Process Technology Workshop, 20–23 February at U.C. Irvine.

Heimbigner, D. and A. Wolf. 1997. Micro-processes. Software Engineering Research Laboratory, University of Colorado, Boulder, Research Directions in Process Technology Workshop, Nancy, France, July.

Heineman, G. and G. Kaiser. 1996. The CORD approach to extensible concurrency control. Columbia University Department of Computer Science, Technical Report CUCS-024095, February.

Hubert, L. 1991. Eureka Software Factory – OPIUM – an environment for software process modeling integrated with project management and product management facilities. In First European Workshop on Software Process Modeling, May.

Humphrey, W. 1989. Managing the Software Process. Addison-Wesley, Reading, MA.

Hyun, M., E. Miller, J. Phillips and R. Smith. 1995. Cognitive architecture project. Agent Properties, on the WWW: http://krusty.eecs.umich.edu/cogarch2/index.html

Ibrahim, B. 1997. Optimizing cut-and-paste operations in directed-graph editing. HCI International '97, San Francisco, California, August.

Jaccheri, M. and R. Conradi. 1993. Techniques for process model evolution in EPOS. IEEE Transactions on Software Engineering, 19(12) (special issue on software process evolution), pp. 1145–1156.

Junkermann, G. and W. Schaefer. 1994. Merlin: supporting cooperation in software development through a knowledge-based environment. In Advances in Software Process Technology, Wiley.

Kadia, R. 1992. Issues encountered in building a flexible software development environment: lessons from the Arcadia project. In Proceedings of ACM SIGSOFT Fifth Symposium on Software Development Environments, pp. 169–180, Tyson's Corner, VA, December.

Kadia, R. 1995. The Collected Arcadia Papers on Software Engineering Environment Infrastructure. Vols 1 and 2, Second Edition, University of California, Irvine.

Kaiser, G. 1989. Rule-based modeling of the software development process. Proceedings of the 4th International Software Process Workshop, October 1988. Published in ACM SIGSOFT Software Engineering Notes, June.

Kaiser, G. 1996. Talks on OzWeb, DkWeb, Amber, Pern, CORD, summits and treaties. Columbia University Department of Computer Science, 2nd Process Technology Workshop, 20–23 February at U.C. Irvine.

Kaiser, G., S. Dossick, W. Jiang and J. Yang. 1997. An architecture for WWW-based hypercode environments. In Proceedings of the 19th International Conference on Software Engineering, pp. 3–13, Boston, MA, USA, May.

Kaiser, G. et al. 1998a. WWW-based collaboration environments with distributed tool services. World Wide Web Journal, 1, Baltzer Science Publishers, January.

Kaiser, G., S. Dossick, W. Jiang and J. Yang. 1998b. WWW-based collaboration environments with distributed tool services. World Wide Web Journal, 1, pp. 3–25.

Kaiser, G. and E. Whitehead. 1997. Collaborative work: distributed authoring and versioning. Column, IEEE Internet Computing, March/April, pp. 76–77.

Kaplan, S., G. Fitzpatrick, T. Mansfield and W. Tolone. 1997. MUDdling Through. In Proceedings of the Thirtieth Annual Hawaii International Conference on System Sciences (HICSS'97), vol. II, IEEE Computer Society Press, pp. 539–548, January.

Kellner, M. 1991. Software process modeling support for management planning and control. In Proceedings of the First International Conference on the Software Process.

Kling, R. 1991. Cooperation, coordination, and control in computer supported work. Communications of the ACM, 34(12), pp. 83–88.

Kobialka, H.-U. and A. Gnedina. 1995. Customizing dynamics in configuration management systems. 2nd International Conference on Concurrent Engineering, pp. 557–564, August.

Kraemer, K. and J. King. 1988. Computer-based systems for cooperative work and group decision making. ACM Computing Surveys, 20(2), pp. 115–146.

Krishnakumar, N. and A. Sheth. 1995. Managing heterogeneous multi-system tasks to support enterprise-wide operations. Distributed and Parallel Databases, 3(2), pp. 155–186.

Lai, K.-Y., T. Malone and K.-C. Yu. 1988. Object Lens: a 'spreadsheet' for cooperative work. ACM Transactions on Office Information Systems, 6, pp. 332–353.

Lee, J., Y. Gregg and the PIF Working Group. 1996. The PIF process interchange format and framework. Technical report, University of Hawaii Information and Computer Science Department, February.

Lehman, M. 1996. Laws of software evolution revisited. Proceedings of the 5th European Workshop on Software Process Technology, EWSPT'96, pp. 108–124, Nancy, France, October.

Lotus Development Corporation. 1997. Domino.Doc Whitepaper, Lotus Development Corp, April. http://www2.lotus.com/domino/doc.nsf/

Markus, M. and T. Connolly. 1990. Why CSCW applications fail: problems in the adoption of interdependent work tools. CSCW 90 Proceedings, ACM, pp. 371–380.

Malone, T. and K. Crowston. 1994. The interdisciplinary study of coordination. ACM Computing Surveys, 26(1), pp. 87–119.

Malone, T., K. Crowston, J. Lee and B. Pentland. 1995. Tools for inventing organizations: toward a handbook of organizational processes. Workflow 95 Conference Proceedings, pp. 57–78.

Malone, T. and K. Crowston. 1990. What is coordination theory and how can it help design cooperative work systems? Proceedings of the Conference on Computer-Supported Cooperative Work, Los Angeles, CA, pp. 357–370, October.

Maybee, M., D. Heimbigner and L. Osterweil. 1996. Multilanguage interoperability in distributed systems: experience report. Proceedings of the 18th International Conference on Software Engineering (ICSE-18), pp. 451–463, March.

McCrickard, D. 1997. Information awareness on the desktop: a case study. Technical Report GIT-GVU-97-03, GVU Center, Georgia Institute of Technology, March.

Microsoft Corporation. 1997. Exchange Server. http://www.microsoft.com/products/pro-dref/599Fov.htm

Miller, J., A. Sheth, K. Kochut and D. Palaniswami. 1997. The future of web-based workflows. LSDIS, Department of Computer Science, University of Georgia, Athens, Research Directions in Process Technology Workshop, Nancy, France, July.

Netscape Communications. 1997. Netscape Collabra Server 3.0, Open Discussion Server for Enterprise Collaboration, Datasheet. Netscape Communications Inc. http://home.netscape.com/comprod/server central/product/news/collabra3Fdata.html

Nutt, G. 1996. The evolution toward flexible workflow systems. Distributed Systems Engineering, 3(4), pp. 276–294.

Nutt, G., S. Brandt, A. Griff and S. Siewert. 1999. Dynamically negotiated resource management for virtual environment applications. IEEE Transactions on Knowledge and Data Engineering, in press.

Nutt, G., T. Berk, S. Brandt, M. Humphrey and S. Siewert. 1997. Resource management for a virtual planning room. 1997 International Workshop on Multimedia Information Systems, pp. 129–134, Como, Italy, September.

Nuseibeh, B. 1997. Software processes are human too. Department of Computing, Imperial College, London, Research Directions in Process Technology Workshop, Nancy, France, July.

Object Management Group. 1996. The common object request broker: architecture and specification, revision 2.0. July. http://www.omg.org/corba/corbiiop.htm

Osterweil, L. 1987. Software processes are software too. In Proceedings of the Ninth International Conference on Software Engineering, Monterey, CA, March.

Osterweil, L. 1997. Software processes are software too, revisited. In Proceedings of the International Conference on Software Engineering '97, Boston, MA, pp. 540–548, May.

Osterweil, L. and S. Sutton. 1996. Factored language architecture (of Julia). University of Massachusetts, Amherst, 2nd Process Technology Workshop, 20–23 February at U.C. Irvine.

Perry, D. 1997. Direction in process technology – an architectural perspective. Bell Laboratories, Research Directions in Process Technology Workshop, Nancy, France, July.

Peterson, J. 1981. Petri Net Theory and the Modeling of Systems. Prentice-Hall.

Saastamoinen, H., M. Markkanen and V. Savolainen. 1994. Survey on exceptions in office information systems. Technical Report CU-CS-712-95, Department of Computer Science, University of Colorado, Boulder.

Sachs, P. 1995. Transforming work: collaboration, learning, and design. Communications of the ACM, 38(9), pp. 36–44.

Salasin, J. Overview: military applications of future software technologies. DARPA briefing.

Simon, H. 1981. The Sciences of the Artificial, 2nd edition. MIT Press, Cambridge, MA.

Skopp, P. 1995. Low bandwidth operation in a multi-user software development environment. Columbia University Department of Computer Science, Master's Thesis.

Slein, J., F. Vitali, E. Whitehead Jr. and D. Durand. 1997. Requirements for distributed authoring and versioning on the World Wide Web. Internet-draft, work-in-progress, July. ftp://ds.internic.net/internet-drafts/draft-ietf-webdav-requirements-01.txt

Song, X. and L. Osterweil. Engineering software design processes to guide process execution. Proceedings of the 3rd International Conference on the Software Process, Reston, VA, October, pp. 135–152.

Suchman, L. 1983. Office procedure as practical action: models of work and system design. ACM Transactions on Office Information Systems, 1(4), pp. 320–328.

Suchman, L. 1987. Plans and Situated Actions: The Problem of Human-Machine Communication. Addison-Wesley, Reading, MA.

Sun Microsystems. 1997a. Java Remote Method Invocation Specification. Sun Microsystems Inc. http://www.javasoft.com/products/jdk/1.1/docs/guide/rmi/spec/rmiTOC.doc.html

Sun Microsystems. 1997b. The Java Servlet API. White paper, Sun Microsystems Inc. http://jserv.javasoft.com/products/java-server/webserver/fcs/doc/servlets/api.html

Sutton, S., Jr., D. Heimbigner and L. Osterweil. 1995. APPL/A: a language for software process programming. ACM Transactions on Software Engineering and Methodology, 4(3), pp. 221–286.

Sutton, S., Jr. and L. Osterweil. 1997. The design of a next-generation process language. Proceedings of the Fifth ACM SIGSOFT Symposium on the Foundations of Software Engineering, Zurich, Switzerland, September. Lecture Notes in Computer Science #1301, pp. 142–158.

Swenson, K. and K. Irwin. 1995. Workflow technology: tradeoffs for business process re-engineering. Conference on Organizational Computing Systems, pp. 22–29.

Taft, T. 1996. Programming the Internet in Ada 95. Draft submitted to Ada Europe '96, 15 March.

Tarr, P. and L. Clarke. 1993. PLEIADES: an object management system for software engineering environments. Proceedings of the First ACM SIGSOFT Symposium on the Foundations of Software Engineering, pp. 56–70, December.

Taylor, F. W. 1911. Principles of Scientific Management. Harper & Row, New York.

Taylor, R. 1997. Dynamic, invisible, and on the Web. University of California, Irvine, Research Directions in Process Technology Workshop, Nancy, France, July.

Taylor, R., F. Belz, L. Clarke, L. Osterweil, R. Selby, J. Wileden, A. Wolf and M. Young. 1988. Foundations for the Arcadia environment architecture. In Proceedings of the ACM SIGSOFT/SIGPLAN Software Engineering Symposium on Practical Software Development Environments, pp. 1–13, ACM Press, November.

Taylor, R. et al. 1996. A component- and message-based architectural style for GUI software. IEEE Transactions on Software Engineering, 22(6).

Taylor, R. et al. 1995. Chiron-1: a software architecture for user interface development, maintenance, and run-time support. ACM Transactions on Computer-Human Interaction, 2(2), pp. 105–144.

Tolone, W. 1994. Introspect: a meta-level specification framework for dynamic, evolvable collaboration support. PhD Thesis, University of Illinois at Urbana-Champaign.

Valetto, G. and G. Kaiser. 1995. Enveloping sophisticated tools into process-centered environments. In Automated Software Engineering, Kluwer Academic Publishers.

Weiser, M. 1991. The computer of the 21st century. Scientific American, 265(3), pp. 66–75.

Weiser, M. 1993. Some computer science issues in ubiquitous computing. Communications of the ACM, 36(7), pp. 75–84.

Whitehead, E. 1997. World Wide Web distributed authoring and versioning (WebDAV): an introduction. ACM StandardView, 5(1), pp. 3–8.

Whitehead, E. et al. 1995. Software architecture: foundation of a software component marketplace. Proceedings of the First International Workshop on Architectures for Software Systems, in cooperation with ICSE-17, Seattle, Washington, 24–25 April.

Winograd, T. 1994. Categories, disciplines, and social coordination. Computer Supported Cooperative Work, 2, pp. 191–197.

Yang, J. and G. Kaiser. 1998. WebPern: an extensible transaction server for the World Wide Web. Technical Report CUCS-004-98, Columbia University Department of Computer Science, January.

Young, P. 1994. Customizable process specification and enactment for technical and non-technical users. PhD Dissertation, University of California, Irvine.

Young, P. and R. Taylor. 1994. Human-executed operations in the Teamware process programming system. Proceedings of the Ninth International Software Process Workshop.

Young, P. and R. Taylor. 1995. Process programming languages: issues and approaches. 17th International Conference on Software Engineering, Workshop on Software Engineering and Programming Languages, Seattle, Washington, 23–30 April.

Winograd, T. and F. Flores. 1987. Understanding Computers and Cognition. Addison-Wesley.

WEB CITATIONS AND FURTHER REFERENCES

[ActionTech] – http://www.actiontech.com/ – Action Technologies
[Advanced Workflow] – http://www.endtech.com/papers/AdvancedWorkflow.pdf – this document
[Apache] – http://www.apache.org/ – Apache Web Server Project
[C2] – http://www.ics.uci.edu/pub/c2/ – Chiron-2 Software Architecture
[Chimera] – http://www.ics.uci.edu/pub/chimera/ – Chimera Open Hypertext
[DARPA-EDCS] – http://www.darpa.mil/ito/research/edcs/ – DARPA's Evolutionary Design of Complex Software; http://www.darpa.mil/ito/ – Information Technology
[Endeavors] – http://www.ics.uci.edu/pub/endeavors/ – Advanced Workflow
[Java] – http://www.javasoft.com/ – Javasoft
[LSDIS] – http://lsdis.cs.uga.edu/ – Large Scale Distributed Information Systems, University of Georgia
[Openscape] – http://www.openscape.org/ – Openscape WWW Browser Project
[PSL] – http://www.psl.cs.columbia.edu/ – Programming Systems Lab, Columbia University
[SEI] – http://www.sei.cmu.edu/ – Software Engineering Institute
[W3C] – http://www.w3.org/ – The World Wide Web Consortium
[WebDAV] – http://www.ics.uci.edu/~ejw/authoring/ – Distributed Authoring and Versioning on the WWW, IETF Working Group
[WebSoft] – http://www.ics.uci.edu/pub/websoft/ – WWW Software and Protocols

[WfMC] – http://www.wfmc.org/ – Workflow Management Coalition

APPENDIX: DEFINITIONS AND USAGE

Advanced workflow: a body of technologies and approaches that makes a philosophical assumption about the evolutionary nature of cooperative work in that it is most elegantly accomplished when it can be described, shared, visualized, distributed, dynamically changed and interconnected.
Collaboration: a joint intellectual effort in which sharing of information is required.
Communication: exchange of thoughts, messages, or information; or the mechanisms by which they are transmitted.
Cooperation: the act of association for furthering a goal that is of mutual benefit.
Coordination: the act of managing interdependencies between activities performed to achieve a goal.
Elegance: the information theoretic principle that no other construct can be proven to achieve the same goal with the same constraints; usually size, time or cost.
Evolution: a process that involves change over time, usually in the face of a more efficient or elegant solution.
Market: a public gathering for exchange of merchandise or ideas.
Monotonic: incremental change that does not invalidate previous changes and maintains their consistency.
Procedure: a style of performing or effecting a goal.
Process: a series of actions, changes or functions toward accomplishing a goal.
Stakeholder: one who has a stake, interest, reward or penalty in the outcome of an endeavor.
Workflow: the interaction of humans and computer processes to generate and share information.