

DSS in Organizations

Setting Priorities for DSS Development

By: C. Lawrence Meador
Martin J. Guyote
Research & Planning, Inc.
215 First Street
Cambridge, Massachusetts 02142

By: Peter G.W. Keen
Informational Technology Services, Inc.
One Forks Road
Lexington, Massachusetts 02123

Abstract

Traditional project management and design methods used for data processing and MIS applications are ill-suited to decision support systems (DSS). The authors argue that effective management of DSS development requires:

a) An explicit plan for the full development life cycle;
b) Careful assignment of responsibility for DSS development;
c) Appropriate user involvement and direction; and
d) On-going user needs assessment and problem diagnosis.

A 13-stage tactical plan for DSS development, called the DSS development life cycle, is described. Results are presented from an in-depth survey of users of 34 different DSS to show that the tasks performed most ineffectively in DSS development are planning, assessment of user needs, and system evaluation. Results from the survey are also presented that show the factors responsible for DSS project approval, and the factors responsible for DSS success.

Keywords: Decision support systems, development life cycle, architectural features, user need assessment, defining success

ACM Categories: H.1.2, H.4.2, J.0

Introduction

Decision support systems (DSS) has become a popular buzzword in the last few years. The term DSS applies to information systems designed to help managers solve problems in relatively unstructured decision-making environments. The growing popularity of the DSS concept can be attributed, in part, to the increased competitiveness, uncertainty, and complexity of the economic environment. Successful DSS applications have included long-range and strategic planning, policy setting, new product planning, marketing planning, cash flow management, operational planning and budgeting, and portfolio management [3]. These systems usually integrate sophisticated data management, modeling, analytical, and display capabilities with powerful, user-friendly command languages.

Managers now have a more broadly perceived need for decision support and, in many cases, a belief that computer systems can be a valuable source of help [4]. In addition, the widespread and rapidly expanding diffusion of microcomputers has changed the way most managers think about computers in general. They are far more proactive in their interest and less resistant to the idea that they themselves can be "hands-on" users. They hear about software tools like VisiCalc (and the "Visi-clones") that make it easy for computer novices to begin building models and incorporating more quantitative analysis approaches into their planning and decision making. Many managers are curious about computerized decision support, yet these managers are properly hesitant. They are uncertain about which criteria would help them decide whether to proceed with a DSS, and what impact they can expect a DSS to have on the organization and on themselves. DSS development often requires an unpredictable investment in time and money, and managers cannot easily estimate what return to expect on this investment beyond some vague promise of "better decisions." They may also be unsure of how to initiate and develop a DSS or how to manage the development process as it evolves.

Purpose and Approach

The authors have conducted an in-depth survey of DSS users in a wide variety of firms and industries.


The results (based on 19 organizations and 34 DSS's) offer some insights into managers' needs, how DSS's are being developed and used to address these needs, and the impacts of DSS. This article summarizes these results, and defines strategies for managing DSS development and use.

The likelihood of a DSS succeeding and having the desired organizational impact strongly depends on understanding how to manage DSS development and use over time. Managerial decision environments, and managers' ways of thinking about them, change frequently. To be useful, a DSS must be responsive to these changes. A distinctive feature of DSS development relative to other information systems is the use of an "adaptive" design and development strategy [9]. This calls for small-scale initial prototype systems, continued incremental development, and responsiveness to users' changing needs. Many articles on DSS design highlight aspects of this approach: Keen [9] calls it "adaptive design"; Ness [10] terms it "middle-out" (versus top-down and bottom-up); Sprague and Carlson [11] provide general summaries; and Bennett [12] describes specific examples of user integration and adaptive methods. The need for proactive DSS evolution follows naturally from the nature of the managerial tasks being supported. Ongoing management of DSS development and use, instead of just installation and exit, is necessary to address the need for responsiveness over time. The end user survey described here focuses on these issues.

End User Survey

The end user survey on which this article is based has several purposes:

1. To provide a theoretical framework for systematic investigation and discussion of DSS end user design and development issues,

2. To develop an experimental methodology for gathering data relevant to these issues, and

3. To develop and test managerially useful diagnostic instruments and survey techniques for assessing critical DSS variables in a wide range of organizations.

The survey was conducted through administration of a written questionnaire. An hour-long version of the questionnaire was used for in-depth personal analysis of selected firms; questionnaires for the remaining firms were administered by mail and were shorter, with a completion time of approximately twenty minutes. The questions covered four areas:

1. Organizational context, and individual attitudes concerning computers, the organization's data processing department, and the effectiveness of the DSS development process,

2. Structural characteristics of important decisions/tasks facing end users,

3. Characteristics of the decision process for these tasks, and the impact of the DSS on the process, and

4. Characteristics of DSS implementation and use, plus end user assessments of DSS architecture, capabilities, and success.

The emphasis on brevity and ease of completion led to a reliance on direct questions and the use of rating scales for responses.

The questionnaire was accompanied by a cover sheet that described what was meant by DSS. Each respondent was asked to answer the survey questions with reference to a single, computer-based system. Users were asked to choose an integrated system used to support decision making that offered end users the capability to model some important decisions, perform necessary analyses, manage the data involved, and generate reports and/or other visual displays. By an "integrated" system we meant one that offered these capabilities under a single interface (even though several software tools may have been used to create the system). The primary types of decisions made with the DSS in the survey are shown in Figure 1. Although we did not specifically distinguish between institutional and ad hoc DSS (see Donovan & Madnick [13]), all but a few of the DSS were institutional in nature and offered "organizational support" [14] (i.e., they were used to support recurring decisions such as budgeting and long-range planning and to coordinate decisions across the organization). Respondents were asked to rate the DSS with regard to the capabilities embodied in the system as presented to and used by them (which may or may not have reflected the capabilities of the underlying software tools).

The survey included widely different types of systems.


Type of Decision ... Percent of respondents whose primary use of the DSS is for this type of decision

General long-range planning ... 38
Strategic assessment ... 23
Product strategy ... 8
Negotiation of budgets ... 8
Reporting and analysis ... 8
Operational planning and control ... 6
Acquisition strategy ... 2
Capital investment strategy ... 2
Financing strategy ... 2
General budgeting ... 2
Cash flow management ... 2

Figure 1. Percent Distribution of Primary Type of Decision Made With DSS

Languages used to build the DSS included BASIC, FORTRAN, COBOL, PL/I, and APL, although most used specialized languages such as EXPRESS, IFPS, FOCUS, EPS, EMPIRE, and XSIM. There was little variation in the nature of the hardware supporting the systems; nearly all of the systems ran on internal mainframes through the data processing department.

While two of the systems had been in use for six years, half of the systems had been in use for nine months or less. Frequency of use of the systems varied from once a week or more (for about half of the systems), to every two to three weeks (about one out of four systems), to less than once every six months.

Respondent Characteristics

Questionnaires were sent to attendees of the DSS Software Conference held in Cambridge, Massachusetts, and to selected members of national associations of strategic and financial planners during 1982 and 1983. In addition, 50 respondents at three test sites were chosen for in-depth analyses. These respondents were chosen in conjunction with high-level DSS users and providers at each site. The total number of questionnaires distributed was 340. The total number of questionnaires completed was 73, from 18 firms representing 34 different decision support systems. This represents a response rate of 22%.

Respondents in the end user survey came from firms with annual revenues ranging from a few million to several billion dollars. The majority of the respondents were senior and middle managers; the remainder were staff planners and analysts. All were either direct or indirect end users of the DSS they rated. The greatest number were in marketing departments, although financial and corporate planning functions were also well represented, as were MIS groups (Figure 2).

By and large, the respondents had extensive experience with computers or computer systems. It is not surprising, then, that they also held favorable attitudes toward computers. Respondents strongly disagreed with statements that computers were unlikeable or threatening, and strongly agreed that computers can increase the effectiveness of senior managers.

The direct users of the systems varied considerably. In twelve cases, corporate or division management used the DSS directly. Nevertheless, in most cases the direct users of the systems were staff personnel, with top management using the system through intermediaries.


[Figure 2. Frequency Distribution of Respondent Department -- a bar chart of respondent counts (horizontal axis: frequency) for the departments Marketing, Finance, Planning, Data Processing, Accounting, Staff Support, Product Development, Production, and R&D.]

Respondent Ratings of Data Processing

When asked to rate the importance of various skills for the data processing department, as well as how they felt their in-house DP group performed in these areas, the survey respondents rated as most important: sensitivity to users' needs, project management skills, and the motivation and education of users. These, and other "people" skills, were given priority over technical skills by the respondents (Figure 3). In this, DSS end users agreed with the information systems managers and systems analysts surveyed by Benbasat [15]. In terms of performance, DP departments were rated most highly in their willingness to work closely with users in designing new systems, and lowest in their knowledge of user department operations and their expertise in designing analysis-based systems like DSS.


Skill ... Average Rated Importance / Average Rated Performance

Sensitivity to users' needs ... 6.23 / 4.43
Project management skills (planning and control) ... 5.64 / 4.42
Implementation planning: education, motivation and training of users ... 5.49 / 4.24
Expertise in design of analysis-based systems like DSS ... 5.47 / 3.57
Intimate knowledge of your department's operations ... 5.21 / 3.82
Willingness to work closely with users in designing new systems ... 5.09 / 5.84
Leadership ability, administrative experience, sensitivity to political issues ... 4.78 / 4.03

Figure 3. Average Rated Importance and Performance of DP Department Skills (1 = Low, 7 = High)


Respondent Ratings of DSS Features and Effectiveness

Respondents were asked to give their ratings of the importance of specific DSS design features, as well as of the performance of their own systems with regard to these features. The results are shown in Figure 4; the rating scales used for this question and all others in the survey ranged from 1 (low) to 7 (high). Respondents agreed strongly about the importance of general features such as system adaptability, ease of learning and use, and integration of all components under a single command language. Common priorities concerning database characteristics stressed security and the ability to support large and complex databases with many dimensions and variables. Among the remaining features, the highest priorities were sophisticated graphics and report formatting capabilities, and tools for performing "what-if" and sensitivity analyses.

Respondents were evidently satisfied with many of the database characteristics of their systems. The three features of security, database size, and complexity mentioned above were also among the highest rated features in terms of system performance. The DSS also received relatively high marks for database expandability and display formatting. The lowest marks were reserved for the available analytic and mathematical functions and, to a lesser extent, the modeling features of the systems.

Respondents were asked to rate their agreement with several statements indicating the "success" of their DSS. These statements included:

1. The DSS fits in well with our planning methods.
2. The DSS fits in well with our reporting methods.
3. The DSS fits in well with our way of thinking about problems.
4. The DSS has improved our way of thinking about problems.
5. The DSS fits in well with the "politics" of how decisions are made around here.
6. Decisions reached with the aid of the DSS are usually implemented.
7. The DSS has resulted in substantial time savings.
8. The DSS has been cost-effective.
9. The DSS has been valuable relative to its cost.
10. The DSS will continue to be useful to our organization for a number of years.
11. The DSS has so far been a success.

The first six statements were meant to indicate how effective the DSS had been in supporting organizational decision making; the average rated level of agreement with statements (1) to (6) was 4.65. Respondents were therefore equivocal in their assessment of the DSS' impact on decision making. Nevertheless, they were enthusiastic about the cost-effectiveness and overall worth of their DSS. The last five statements received an average rating of 5.43. The highest intercorrelations were also found among the last five statements (the average correlation between pairs of ratings for statements (7) to (11) was .70). Ratings for the first six statements were highly intercorrelated, but correlated less strongly with the statements regarding cost-effectiveness and success. These findings indicate that the perceived success of a DSS is dependent on more than its "fit" with organizational decision making; perceptions of overall cost-effectiveness predominate.
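To make the form of this analysis concrete, the sketch below computes mean agreement for the two groups of statements and the average pairwise correlation within the second group. It is a minimal illustration in Python: the rating matrix is hypothetical placeholder data, not the survey's responses, so the printed numbers will differ from the 4.65, 5.43, and .70 reported above.

```python
import numpy as np

# Hypothetical 1-7 agreement ratings: one row per respondent, one column per
# success statement (1-11). Placeholder values only; the survey's actual
# responses are not reproduced here.
ratings = np.array([
    [5, 4, 5, 4, 4, 5, 6, 6, 6, 7, 6],
    [4, 5, 4, 5, 3, 4, 5, 6, 5, 6, 6],
    [5, 5, 6, 4, 5, 5, 6, 5, 6, 6, 7],
    [3, 4, 4, 3, 4, 4, 5, 5, 5, 6, 5],
], dtype=float)

fit_items = ratings[:, :6]    # statements 1-6: fit with decision making
worth_items = ratings[:, 6:]  # statements 7-11: cost-effectiveness and worth

print(f"mean agreement, statements 1-6:  {fit_items.mean():.2f}")
print(f"mean agreement, statements 7-11: {worth_items.mean():.2f}")

# Average pairwise correlation among statements 7-11 (the kind of
# intercorrelation summarized in the text).
corr = np.corrcoef(worth_items, rowvar=False)   # 5 x 5 correlation matrix
pairs = corr[np.triu_indices_from(corr, k=1)]   # upper triangle, no diagonal
print(f"average pairwise correlation, statements 7-11: {pairs.mean():.2f}")
```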

Ratings of the importance of various factors in the DSS project approval process showed the most important factor to be top management emphasis on the proposed system (Figure 5). The next most important factor was the anticipated return on investment in a cost/benefit sense; however, it seems that qualitative or "soft" benefits are not given a great deal of consideration.

Next in importance came a variety of practical issues such as development costs, impacts on data processing resources, and degree of user commitment. Surprisingly, despite the political implications of some of these factors, "company politics" rated last in importance. This may reflect the phrasing of the question; "company politics" is a more ambiguous and inclusive term than "degree of user commitment," "top management emphasis," etc.

Respondent Ratings of DSS Development Life Cycle

To be effective, the DSS design and development process should follow a plan -- "adaptive" does not mean ad hoc. A tactical plan is needed that lists specific tasks to be performed, plus their order and frequency of performance. Since DSS evolve, that plan must be cyclical; that is, all or most of the tasks will have to be redone at regular intervals to ensure responsiveness to changing users' needs.


Figure 4. Average Rated Importance and Performance of DSS Features

Feature ... Average Rated Importance / Average Rated Performance

The data structure allows for multiple sets of rows and columns (or dimensions and subscripts, e.g., sales data arranged by month, geographic area, product, etc.) ... 6.32 / 5.28
The system supports a database large enough (in terms of the number and size of data entries) and complex enough (in terms of the number of rows, columns, or dimensions) to handle all users' needs ... 6.03 / 5.08
The system is easily learned by new users, even those with limited computer experience ... 6.00 / 4.57
The contents of the database can be easily changed and expanded ... 5.97 / 4.91
The command language is natural to use and English-like ... 5.83 / 4.77
A security system exists that allows for restricted access to the database using prespecified criteria (e.g., ID's, passwords, etc.) ... 5.76 / 5.51
The user has extensive control over display format for reports and graphs (e.g., labels, titles, column size, decimal placement, report size, headers, scaling, plot symbols, etc.) ... 5.56 / 4.80
The system is integrated, with all components running under a single command language ... 5.55 / 4.72
The system is adaptive; system components and language can grow as users' needs change ... 5.55 / 4.72
What-if and sensitivity analysis functions, Monte Carlo capabilities ... 5.41 / 4.42
There exists a database dictionary that catalogs all elements in the database (variable and dimension names, programs and functions, etc.) ... 5.39 / 4.58
The system can accept and manipulate data created by other systems, especially commonly used packages ... 5.31 / 3.58
Multiple database entries, each with different structure or characteristics, can exist simultaneously within the database ... 5.27 / 3.87
There is an interactive command or program with which the user can alter the contents of the database or enter new data ... 5.24 / 4.77
An array of standard financial reports and exception reports can be generated automatically ... 5.21 / 4.45


Users can easily modify existing programs and functions, and write their own ... 5.16 / 4.20
Data for reports and tables can be drawn from many sources within the database simultaneously ... 5.15 / 4.52
A wide array of graph formats is supported (line, bar, histogram, pie, stock market, multiple plots) ... 5.14 / 3.75
The system is flexible; users from different functional areas with problems of different levels of complexity can all use it ... 5.07 / 4.35
Trend projection/curve fitting functions, and time-series analyses ... 5.03 / 4.00
Two or more databases can be conceptually merged or hierarchically related ... 5.00 / 4.06
The system formats display tables automatically for multidimensional data structures ... 5.00 / 4.80
The format in which data is entered can be flexible and is not strictly regimented ... 4.97 / 3.52
A wide range of statistical procedures is available (descriptive statistics, regression, other multivariate techniques, scaling techniques, non-parametric, log-linear methods) ... 4.97 / 3.81
Multiple models can exist within a database ... 4.94 / 4.35
Goal seeking, or backward iteration, automatic reordering of systems of equations, and solution of systems of simultaneous equations ... 4.81 / 4.09
Time-series data can be mixed with non-temporal data ... 4.72 / 3.24
Standard financial functions (e.g., IRR, NPV, depreciation, discounted cash flow, capital asset pricing) ... 4.71 / 4.23
Risk analysis with wide range of distributions ... 4.46 / 3.65
The command language can be tailored to meet individual requirements and tastes ... 4.43 / 4.06
Ad-hoc calculator mode with all algebraic operators ... 4.40 / 3.96
Econometric modeling facilities with a wide range of parameter estimation techniques ... 4.35 / 2.76
Optimization functions (e.g., linear, integer, mixed integer, nonlinear, dynamic, and goal programming) ... 3.91 / 3.26


Factor ... Average Rated Importance

Top management emphasis ... 5.91
Return on investment (cost/benefit) ... 5.04
Technically do-able ... 4.87
DSS development costs ... 4.76
Impact on data processing resources ... 4.70
Degree of user commitment ... 4.70
Increase in user effectiveness ... 4.67
DSS operating costs ... 4.64
Increase in user efficiency ... 4.61
Adaptability of organization to change ... 4.52
Urgency of user needs ... 4.49
Uncertainty of objectives for DSS design ... 4.27
Qualitative or "soft" benefits ... 4.11
Company politics ... 4.09

(1 = Low, 7 = High)

Figure 5. Average Rated Importance of Factors in DSS Project Approval Process

The survey asked users to evaluate the importance and the performance in their organization of each of the various stages of DSS development. We call the tactical plan used in this research the DSS development life cycle (Figure 6). The tasks are described below:

1. Planning -- User needs assessment and problem diagnosis.

2. Application research -- Identification of relevant fundamental approaches for addressing user needs and available resources (vendors, systems, studies of related experiences in other organizations, and review of relevant research).

3. Analysis -- Determination of the best approach and the specific resources required to implement it, including technical, staff, financial, and organizational resources.

4. Design -- Detailed specification of system components, structure, and features.

5. System construction -- Technical implementation of the design.

6. System testing -- Collection of data on system performance to determine whether the system performs in accordance with design specifications.

7. Evaluation -- Determination of how well the implemented system satisfies users' needs and identification of technical and organizational loose ends.

8. Demonstration -- Demonstration of the fully operational system capabilities to the user community.

9. Orientation -- Instruction of top-level managerial users in the basic capabilities of the system.

10. Training -- Training of direct users in system structure and operation.

11. Deployment -- Operational deployment of the full system capability for all members of the user community.

12. Maintenance -- Ongoing support of the system and its user community.

13. Adaptation -- Planned periodic recycling through the above tasks to respond to changing user needs.

In assessing the importance of DSS development stages, users rated the planning stage (user needs assessment and problem diagnosis) as the highest priority; the lowest priorities involved research, detailed specifications, and system construction (Figure 7). On the other hand, it was the more technical, typically DP-oriented activities that were given the highest marks for performance. The most poorly performed activities were evaluation and orientation of top-level managers in the basic capabilities of the system.

We measured the effectiveness with which each task was performed for each DSS in the survey. Given that the organization has limited resources to devote to developing a DSS, there should be a direct relationship between importance and performance -- the more important a particular task, the better it should be performed.


[Figure 6. Alternate Paths for DSS Development -- a diagram of the DSS development life cycle (Planning, Research, Analysis, Design, System Construction, Testing, Evaluation, Demonstration, Orientation, Training, Deployment, Maintenance, Adaptation), showing abbreviated development cycles for simple ad hoc DSS and annotated with feasibility assessment and preliminary value analysis, and prototyping and secondary value analysis.]

Obviously, it makes little sense to commit time and resources to a task if its performance has little bearing on success.

Of course, the results varied somewhat for different systems, but the overall pattern is shown in Figure 8. This figure shows the average rated importance and performance of each task in the development life cycle. Planning, evaluation, and orientation of top-level users were all activities that were ineffectively performed. These tasks all fell more than one standard error of estimate below the regression line. (The standard error of estimate indicates the standard deviation of the points from the regression line.) Note that although planning did not receive the lowest absolute average performance rating, Figure 8 suggests that it was the least effectively performed task.

The mediocre performance of planning should not be tolerated, given its distinctive importance. The more technical, DP-oriented tasks of design and construction show the opposite pattern: these were the best performed tasks even though they were rated least important. This means that resources were committed to these tasks that might have been better spent on the more important tasks of planning and evaluation. This result may reflect an avoidance of the more ill-defined tasks in system development. The lesson here is that some assessment of the importance of these tasks should be a part of planning, and DSS development resources should be allocated accordingly. In addition, it seems that technical staff experienced in traditional software development methods may have trouble adjusting to the more unstructured environment of DSS. Data processing begins with functional specifications, DSS with contextual analysis.
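The screening criterion used here -- fit a least-squares line of average performance on average importance, then flag any task whose performance falls more than one standard error of estimate below the line -- can be sketched as follows. This is an illustration of the procedure only: the task names come from the life cycle above, but the numbers are hypothetical placeholders rather than the survey's averages, so the fitted slope and standard error will not match the .82 and .41 shown in Figure 8.

```python
import numpy as np

# Hypothetical (average importance, average performance) pairs for a few
# life-cycle tasks; placeholder values only, not the survey's data (Figure 7).
tasks = {
    "Research":     (5.2, 4.5),
    "Design":       (5.3, 4.7),
    "Construction": (5.4, 5.0),
    "Testing":      (5.6, 4.9),
    "Evaluation":   (5.7, 4.2),
    "Planning":     (6.1, 5.2),
}

names = list(tasks)
importance = np.array([tasks[n][0] for n in names])
performance = np.array([tasks[n][1] for n in names])

# Least-squares regression of performance on importance.
slope, intercept = np.polyfit(importance, performance, 1)
predicted = slope * importance + intercept

# Standard error of estimate: the standard deviation of the points
# about the regression line (the definition used in the text).
residuals = performance - predicted
see = np.sqrt(np.mean(residuals ** 2))

print(f"slope = {slope:.2f}, standard error of estimate = {see:.2f}")
for name, resid in zip(names, residuals):
    flag = "  <-- performed worse than its importance warrants" if resid < -see else ""
    print(f"{name:12s} residual {resid:+.2f}{flag}")
```

Tasks flagged this way are the analogues of planning, evaluation, and orientation in the survey: activities whose performance falls well short of what their rated importance would justify.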


Stage ... Average Rated Importance / Average Rated Performance

DSS Planning: User needs assessment and problem diagnosis ... 6.09 / 4.49
Adaptation: Responsiveness to new user needs as well as to organization and environmental changes ... 5.74 / 4.54
Training: Training of direct system users in system structure and operation ... 5.72 / 4.38
Maintenance: Ongoing support of the system and its user community ... 5.72 / 4.76
Evaluation: Determination of how well implemented system satisfies users' needs ... 5.64 / 4.18
System Testing: Collection of data on system performance to determine whether system performs in accordance with design specifications ... 5.60 / 4.54
Analysis: Determination of best approach and specific resources required to implement it ... 5.51 / 4.74
Orientation: Instruction of top-level managerial users in the basic capabilities of the system ... 5.47 / 4.00
Deployment: Operational deployment of the full system capability for all members of the user community ... 5.37 / 4.43
Construction: Technical implementation of the design ... 5.26 / 5.11
Demonstration: Demonstration of fully-operational system capabilities to members of the user community ... 5.24 / 4.42
Application Research: Identification of relevant fundamental approaches for addressing user needs, and available resources (including systems, vendors, etc.) ... 5.21 / 4.46
Design: Detailed specification of system components, structure, and features ... 5.16 / 4.63

(1 = Low, 7 = High)

Figure 7. Average Rated Importance and Performance of DSS Development Stages


Conclusions

Despite the diversity of the sample, it is a small one, and the data presented here are based on respondents' impressions.

It seems clear that in order to satisfactorily examine these ideas, in-depth longitudinal studies are needed in addition to cross-sectional ones of this type. Having said this, however, we must add that some interesting patterns have emerged.

In interpreting the results, it is important to keep in mind the overwhelmingly favorable attitudes among the respondents toward computers and DSS. The accompanying success of the DSS in the survey could have been either a cause or a result of these attitudes.


[Figure 8. Average Performance vs. Average Importance for DSS Development Process -- a scatter plot of the life-cycle tasks, average performance (3.75 to 5.25) against average importance (4.75 to 6.25), with a fitted regression line; slope of regression line = .82, standard error of estimate = .41.]

Given that most of the DSS were primarily used for organizational decisions involving more than a few people, it was not surprising to find high importance ratings for system adaptability, ease of use, database security, multidimensionality, and sophisticated display capabilities to support communication of results.

Another important finding was the significance of cost-effectiveness as a factor in DSS project approval and perceived DSS success. Although managers may desire more effective decision making, the ability of a DSS to effect more efficient decision making is also important to creating and maintaining managerial support. This contrasts with case studies of early DSS [16], which showed an emphasis on value rather than cost and a general disregard of traditional cost-benefit analysis. The importance of time savings in formulating plans and budgets shows up often. This is particularly understandable for DSS designed to support regularly recurring institutional decisions like the ones in this survey.

In developing a DSS, the least effectively performed tasks were seen to be planning, evaluation, and orientation. The first two tasks are particularly important and difficult because they require the DSS developer to assess users' needs, how well those needs are being (or could be) met by a DSS, and what the DSS should look like to serve those needs. The crucial role played by the initial stages of application development has been borne out by research in MIS and EDP applications. For example, McKeen [17] surveyed 32 business applications systems and found that greater time and effort spent on front-end analysis resulted in less overall system development time, less overall cost, and greater user satisfaction with the delivered system.


Taken together, the survey results pose a dilemma familiar to developers of DSS: the problem of assessment. Before approving large-scale DSS projects, most managers need evidence of some concrete benefits to offset development costs. Evaluations of the success of existing systems also depend on assessing benefits. User needs assessment is critical in providing continuing direction to DSS development. But when DSS's support ill-defined problems, their impact on organizational problem-solving is likely to be ill-defined. Successful introduction of DSS into an organization may then depend on targeting applications where substantial time and cost benefits can be demonstrated (e.g., automating manual budgeting) until such time as users and developers can form more educated opinions about what DSS will do for them.

References

[1] Keen, P.G.W., and Scott Morton, M.S. Decision Support Systems: An Organizational Perspective, Addison-Wesley, Reading, Massachusetts, 1978.

[2] Meador, C.L., and Ness, D.N. "Decision Support Systems: An Application to Corporate Planning," Sloan Management Review, Volume 14, Number 2, pp. 51-68.

[3] Alter, S.L. Decision Support Systems: Current Practice and Continuing Challenge, Addison-Wesley, Reading, Massachusetts, 1980.

[4] Scott Morton, M.S. "Organizing the Information Function for Effectiveness as Well as Efficiency," Working Paper Number CISR-17, Massachusetts Institute of Technology, Cambridge, Massachusetts, 1975.

[5] Rockart, J.F., and Treacy, M.E. "Executive Information Support Systems," Working Paper Number CISR-65, Massachusetts Institute of Technology, Cambridge, Massachusetts, 1975.

[6] Alloway, R.M. "Defining Success for Data Processing," Working Paper Number CISR-56, Massachusetts Institute of Technology, Cambridge, Massachusetts, 1980.

[7] Rockart, J.F., and Bullen, C. "A Primer on Critical Success Factors," Working Paper Number CISR-69, Massachusetts Institute of Technology, Cambridge, Massachusetts, 1981.

[8] Alloway, R.M. "User Managers' Systems Needs," Working Paper Number CISR-56, Massachusetts Institute of Technology, Cambridge, Massachusetts, 1980.

[9] Keen, P.G.W. "DSS: A Research Perspective," Working Paper Number CISR-54, Massachusetts Institute of Technology, Cambridge, Massachusetts, 1980.

[10] Ness, D.N. "DSS: Theories of Design," paper presented at the Wharton Office of Naval Research Conference on DSS, University of Pennsylvania, Philadelphia, Pennsylvania, November 4-7, 1975.

[11] Sprague, R.H., Jr., and Carlson, E.D. Building Effective Decision Support Systems, Prentice-Hall, Inc., Englewood Cliffs, New Jersey, 1982.

[12] Bennett, J.L. "Integrating Users and Decision Support Systems," in Proceedings of the Sixth and Seventh Annual Conferences of the Society for Management Information Systems, J.D. White (ed.), University of Michigan Press, Ann Arbor, Michigan, 1976.

[13] Donovan, J.J., and Madnick, S.E. Institutional and Ad-Hoc Decision Support Systems and Their Effective Use, MIT Working Paper CISR-27, Cambridge, Massachusetts, 1977.

[14] Keen, P.G.W., and Hackathorn, R.D. "Decision Support Systems and Personal Computing," MIS Quarterly, Volume 5, Number 1, March 1981, pp. 21-27.

[15] Benbasat, I., Dexter, A.S., and Mantha, R.W. "Impact of Organizational Maturity on Information Systems Skill Needs," MIS Quarterly, Volume 4, Number 1, March 1980, pp. 21-34.

[16] Keen, P.G.W. "Value Analysis: Justifying DSS," MIS Quarterly, Volume 5, Number 1, March 1981, pp. 1-16.

[17] McKeen, J.D. "Successful Development Strategies for Business Applications Systems," MIS Quarterly, Volume 7, Number 3, September 1983, pp. 47-59.


About the Authors

C. Lawrence Meador is a Lecturer in the School of Engineering, MIT, and is Executive Vice President of Research & Planning, Inc., a consulting firm in Cambridge, Massachusetts. He received his graduate degrees from the Sloan School of Management and the School of Engineering, MIT. He serves on the Board of Directors of Access Data Services, Inc. His current research is concerned with DSS end user needs assessment and software technology feature evaluation. He serves as an editor of Computer Communications and Communicacion e Informatica.

Martin J. Guyote received the M.S. and Ph.D. degrees in cognitive psychology at Yale University, and the M.S. degree in management at MIT. He has been on the faculty of Boston University. He is currently a Consultant with Research & Planning, Inc. in the areas of decision support systems, human problem solving and reasoning, and research methodology.

Peter G.W. Keen, Visiting Senior Lecturer at the London Business School, received his DBA from Harvard University and has been on the faculties of the Sloan School of Management, MIT, Harvard Business School, Stanford University, and the Wharton School at the University of Pennsylvania. His current research includes a study of support systems for policy analysts in educational finance and methodologies for evaluating computer systems which provide for qualitative benefits. He is the managing editor of Office: Technology and People.
