Information & Management 24 (1993) 291-303
North-Holland

Research

Supercomputing in corporate America *

A sample survey

Mohammad M. Amini, Memphis State University, Memphis, TN, USA

Robert E. Schooley, Ernst and Young, Cleveland, OH, USA

Mohammad M. Amini is an assistant professor of management information systems and decision sciences within the Fogelman College of Business and Economics at Memphis State University. He also held a faculty position at Southern Methodist University. His research interests include parallel/vector computing, commercial supercomputing, and mathematical programming. He received his B.A. in Business Administration from Tehran University, an MBA in Production/Operations from the University of North Texas, and an M.S. and Ph.D. in Operations Research from Southern Methodist University. He is a member of ORSA, TIMS, DSI, and BE.

Robert E. Schooley is a manager in the Ernst & Young National Office in Cleveland, Ohio. Working within the Professional and Organization Development department, his primary responsibility is the development of educational products and services designed to enhance the performance of management consultants. Before joining Ernst & Young he was a college professor of management information systems at Oklahoma State University, San Diego State University, and Memphis State University. He earned his doctorate from Oklahoma State University and has a master's degree in Human Resource Development.

Correspondence to: Management Information Systems and Decision Sciences Department, The Fogelman College of Business and Economics, Memphis State University, Memphis, TN 38152; (901) 678-2479; BITNET: aminim@memstvxl; INTERNET: [email protected].

* This research project was partially supported by a Fogelman Research Grant from Memphis State University.

Information technology forecasts predict that supercomputing will have a significant impact on the data processing operations of corporations and will be a competitive necessity in the 1990s. This study attempts to measure the degree of awareness, ability to use, adoption, and intention of corporate America to apply this new technology to improve productivity and profitability, as well as competitive edge, in the global economy of the 1990s. We present the results of a study of 201 corporations in eighteen industries concerning their current and expected future use of supercomputers in mainstream business applications. Specifically, our purposes were to determine existing commercial supercomputing uses and to report on awareness of the business potential of supercomputing, ability to "work with" supercomputers, utilization of commercial supercomputing, and perceived current and future supercomputer applications by competing firms.

Keywords: Supercomputer; Supercomputer applications; Supercomputing in U.S. corporations; Commercial supercomputing

1. Introduction

The landmark MIT study, Management in the 1990s, identified four highly interrelated business forces that will shape the destiny of many organizations in the 1990s: globalization of business operations, worldwide competition, productivity enhancement, and the increasing volatility of the business environment. They will all significantly impact organizational missions, structures, and operating practices in the next decade [61].

Competing organizations will also be impacted by advances in key components of information technology (IT) architecture. Workstations, shared-access distributed databases and knowledge bases, communication networks, and specialized processors (i.e., supercomputers) will begin to merge on the corporate IT scene, due to their predicted continual and dramatic reduction in cost [58].



Also, the dominant strategy theme of the 1990s will be "the recognition and exploitation of IT capabilities for fundamental strategic choices of business scope, governance mechanisms, organizational reconfiguration, and competitive actions in the marketplace" [85]. One of the major implications is that the use of information technology will be a competitive necessity for businesses and governments in the highly volatile and unpredictable global economy of the 1990s.

It has been predicted that mainstream computer applications will become more and more sophisticated, progressing from data and information processing to knowledge and intelligence processing [45]. Each level of sophistication demands more powerful computers, more advanced processing methodologies, and more sophisticated software. Thus, the degree of complexity in IT varies with the extent to which data are processed and with the complexity of the hardware, software, and methodologies applied.

In order to adopt a leading-edge technology, an organization must have a fundamental understanding of it. "What is a supercomputer?" has been a controversial question in business [10,30,39,45,47,73,86]. A simple definition is: "A supercomputer is the fastest available computer." A definition for the business community is: "Qualitatively, supercomputers are those machines at the leading edge of technology in terms of computing speed and the overall ability to perform many computing jobs very fast" [21]. For practical purposes, this definition is adequate for our discussion.

This paper presents the results of a survey of 201 American corporations on their use of supercomputers in mainstream commercial applications. First, we discuss some business supercomputing applications; then, we present the outcomes of our survey.

2. Commercial supercomputing

Supercomputers have expanded the realm of solvable problems in many areas of science and engineering, including: modeling complex phenomena, testing, interpreting test results, and simulating physical processes in a timely manner [15,33], weather forecasting [17,79], automobile and aircraft design [4,29,43,48,60,83], image and sound processing [3,14,77,81], seismic exploration and reservoir modeling [49,54,57,80], artificial intelligence and neural networks [37,38,46,84], biomedical research [9,42,55], and computational chemistry and materials science [36,65,76]. In general, supercomputers are used as experimental laboratories to advance the state of the art in any quantifiable discipline and as tools to minimize complexity where underlying concepts are either poorly understood or too massive to be appreciated by a human mind [75].

The decision-making demands in today's volatile and competitive business world have begun to exert similar processing pressures. A survey of supercomputer vendors reveals that there is growing interest in supercomputers for businesses, but the number of units installed for business applications has been negligible [66]. Meanwhile, the attraction of this technology among venture capitalists has been very strong, and an increasing number of companies have been entering the business of designing, manufacturing, and marketing supercomputers [56,88]. By early 1988, thirty-one companies had introduced more than 60 different supercomputers, and the market was expected to grow at an annual rate of 40% [59].

There is evidence that supercomputers are slowly migrating to corporate data processing centers [41]. However, several major barriers must be overcome before commercial use of this technology becomes widespread. The roadblocks include: MIS executives do not know how to use the technology; end-users do not appreciate the qualitative improvements [90]; application software and COBOL compilers to migrate the vast number of programs in business applications are lacking [68]; well-trained application developers are in short supply [74]; and there is a lack of interdisciplinary work to make optimum use of the technology [78]. From a long-term perspective, a critical obstacle that confronts all organizations in adopting any new information technology is that companies try to overlay new technologies on old structures, processes, and work roles [51]. Without renovation of the corporate mind-set, technology may not be able to deliver the price/performance rates corporations require.

Compute-intensive problems are the prime target of supercomputer applications. So far, the prime applications are in system optimization and simulation [20]. Specifically, supercomputers quickly solve models in the domain of operations research/management science. A partial treatment of such compute-intensive applications in [8,91] and a comprehensive survey in [1] indicate that linear programming, integer programming, nonlinear programming, dynamic programming, network optimization, stochastic programming, simulation, and neural networks can be solved more efficiently by the use of supercomputers. These models have commercial application. But, as is generally the case, the work is driven by people with an inherent interest in the technology.

Major commercial applications in which qualitative and quantitative improvements have been observed are as follows. One is large-scale database management, where retrieval and update of data are time-consuming [5,19,66,69,70,87]. Another, known as data filtering, is the editing and translation of data input into meaningful knowledge, such as finding complex patterns among millions of data records, which demands the power of a supercomputer [88].

In the competitive airline industry, where the profit margin is narrow and survival is often at stake, scheduling of crews and aircraft, maintenance operations, and rescheduling due to delays and weather disruptions give rise to a complex problem. In order to satisfy the multitude of constraints, supercomputers are used to generate "real-time" schedules [16,50,70]. There is also some evidence of the application of supercomputers for scheduling and controlling large-scale automated manufacturing operations where hundreds of machines operate simultaneously [12].

Scanner systems at grocery stores read bar codes to collect information about products and customer behavior, and measure virtually all the marketing variables along the pipeline [40]. As a result, it is possible to gather a valid national sample of 2,500 or more supermarkets for marketing analysis. Parallel supercomputers are used to identify trends in the constantly shifting volume of sales data to determine how a product is faring in one state or city compared to the others.

Forecasting the state of the economy at local, national, or international levels is a complex task that requires the solution of large-scale econometric or computer simulation models. To enhance the accuracy and computational speed of predicting economic growth, these models run on supercomputers [35,72]. The compilation of input/output tables for goods and services in a national economy is another example of supercomputing in economic forecasting. A table for the United States economy takes more than one day on a conventional mainframe; using a supercomputer, it could be completed in fifteen minutes.
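As a concrete illustration of the computation behind such tables, the following is a minimal sketch of the Leontief input/output model: given a matrix A of inter-industry input requirements and a final-demand vector d, total gross output x satisfies x = Ax + d. The three-sector coefficients and demands below are hypothetical, not data from the paper; a national table involves hundreds of sectors, which is what pushes the computation toward supercomputers.

```python
# Minimal sketch of an input/output (Leontief) computation.
# A[i, j] = dollars of sector i's output needed per dollar of sector j's output.
# Total output x solves x = A x + d, i.e. (I - A) x = d.
import numpy as np

A = np.array([[0.2, 0.3, 0.1],
              [0.1, 0.1, 0.4],
              [0.3, 0.2, 0.2]])        # hypothetical 3-sector requirements
d = np.array([100.0, 150.0, 80.0])     # hypothetical final demand ($M)

x = np.linalg.solve(np.eye(3) - A, d)  # gross output needed per sector
print(np.round(x, 1))
```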

In the past few years, supercomputing has found a new arena: the competitive financial markets. Wall Street firms are using supercomputers to solve compute-intensive financial problems faster and with more accuracy. Dow Jones and Co., via its two massively parallel computers, offers the Professional Investor's Report, which tracks the movements of about 6,000 stocks in search of any abnormal activity in quasi-real-time. Esoteric trading strategies with complex new securities, an ever increasing number of traders, worldwide "real-time" trading on a 24-hour basis, and the highly competitive nature of the securities market have called for tremendous computing power at Prudential-Bache Securities [2]. The firm runs three major compute-intensive applications on supercomputers: an Adjustable Rate Mortgage (ARM) pricing model, an Arbitrage-Free Pricing System (APS), and a Mortgage Pass-Through Option Adjusted Spread (OAS) model. These models run on the firm's parallel computer an average of 4 to 8 times faster than on its conventional mainframe [52].

We now expand on the need for a supercomputer to analyze "real-time," compute-intensive, and complex commercial applications with a detailed discussion of two popular and well-documented cases in the area of planning under uncertainty.

2.1. Mortgage-backed securities (MBS)

A major advancement in the financial decision-making process has been the utilization of complex mathematical modeling techniques. The most critical one is the inclusion of uncertainty in single- or multi-period financial planning. Reformulation of deterministic planning models under uncertainty has been relatively easy and was proposed in the early 1950s [7,23]. But the model size for a practical application becomes enormous, and hence the solution intractable. The challenge has been threefold: including more realism in models [89], devising efficient solution methodologies [62,92], and using more powerful computers to obtain accurate and "real-time" solutions. "The tradeoff between the achievable degree of realism and the computational complexity of the mathematical program [63]" has been the core of this challenge.

A variety of mathematical models have, however, become popular for many applications in financial planning. Many studies have focused on the theoretical aspects and descriptions of financial problems [64,82,94]. A survey conducted in 1980 showed that there were applications of many mathematical models in the banking industry [18]; alternative modeling strategies have also been devised [53,71]. Also, commercialization of supercomputers has opened new opportunities. As an example, we concentrate on simulation involving Mortgage-Backed Securities (MBS).

In December 1987, about 31% of the U.S. capital market ($9.381 trillion) was in outstanding mortgages. Residential mortgages reached about $2 trillion by mid-year 1988, and almost half of this, $708 billion, was securitized. Fifty percent of this securitized market is backed by government-insured mortgages, and the remaining 50% is backed by conventional mortgages or is privately issued [6]. The mortgage market consists of a rapidly evolving and ever growing number of mortgage securities products that now constitute the largest and most complex segment of the fixed-income securities market.

With an MBS, investors own an undivided interest in the pool of mortgages that collateralize the security. The cashflow generated by principal and interest is passed through to the investor as it is generated by the mortgages within the pool. Each pool has a coupon, or pass-through rate, an issue date, a final (or stated) maturity date, an average life, and a payment delay. MBSs differ from all other fixed-income securities in two respects. First, an MBS is made up of a series of cashflows consisting of monthly principal and interest, as compared to a typical corporate or government bond, with interest usually paid as a semiannual coupon and principal paid at maturity. Second, the cashflows vary in amount each month, because they depend on the principal and interest payments for the mortgages. In addition, each borrower has an implied call option, i.e., the right of the homeowner to prepay the mortgage at any time. This option may be exercised due to a low-interest-rate market and the ability to refinance.
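To make the contrast with bond coupons concrete, here is a minimal sketch of the monthly cashflows of a single level-payment mortgage, ignoring prepayment; the principal, rate, and term are hypothetical. Early payments are mostly interest and late ones mostly principal, which is one reason pass-through cashflows vary month by month.

```python
# Sketch: monthly principal/interest split of a level-payment mortgage.
def monthly_cashflows(principal, annual_rate, years):
    r = annual_rate / 12.0                             # monthly rate
    n = years * 12
    payment = principal * r / (1.0 - (1.0 + r) ** -n)  # level monthly payment
    balance = principal
    for _ in range(n):
        interest = balance * r
        principal_part = payment - interest
        balance -= principal_part
        yield interest, principal_part

flows = list(monthly_cashflows(100_000, 0.09, 30))  # hypothetical loan
print("first month:", flows[0])    # mostly interest
print("last month:", flows[-1])    # mostly principal
```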

Uncertainty about future interest rates and the exercising of the call option are the major factors in determining the cashflow generated by a group of securities. After buying mortgages from lending institutions, the major decisions for securities firms, under uncertainty, are (1) how to estimate the value of the mortgages, (2) how to package them, and (3) how much interest the fixed-income securities will pay. The valuation and packaging process is compute-intensive and has been recognized to need parallel supercomputers [93]. The general steps in valuation and packaging are [44]:

Step I: Generate arbitrage-free interest rate scenarios consistent with the prevailing term structure of interest rates.

Step II: Generate cashflows for each interest rate scenario.

Step III: Use the cashflows and the short-term interest rates along each path to compute the expected net present value of the cashflows. This can be used to generate a "fair" price for the security or to compute an option-adjusted spread over the yield curve.

The first two results are determined by applying Monte Carlo simulation to interest rate paths and, based on these and some prepayment models for the MBS, the security's cashflows are found. Next, the associated net present values are calculated. Finally, stochastic network optimization techniques are utilized to package securities so that the spread is maximized.
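The following is a minimal sketch of the shape of Steps I-III, not the authors' model: lognormal short-rate paths with mean reversion (the parameters kappa, theta, and sigma, the pool coupon, the straight-line amortization, and the naive prepayment rule are all illustrative assumptions), with cashflows discounted along each path and averaged to estimate a "fair" value.

```python
# Sketch: Monte Carlo valuation of a pass-through pool along simulated
# short-rate paths (all model parameters are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_months = 1000, 360
kappa, theta, sigma = 0.02, 0.08, 0.10   # mean-reversion speed/level, volatility
r0, pool_coupon = 0.09, 0.095

log_r = np.full(n_paths, np.log(r0))
value = np.zeros(n_paths)
discount = np.ones(n_paths)
balance = np.full(n_paths, 1.0)          # remaining pool balance per $1 of face

for t in range(n_months):
    r = np.exp(log_r)
    interest = balance * pool_coupon / 12.0
    # naive prepayment rule: faster prepayment when rates fall below the coupon
    prepay = balance * np.clip(pool_coupon - r, 0.0, None) * 0.5
    scheduled = balance / (n_months - t)          # crude straight-line amortization
    principal = np.minimum(balance, scheduled + prepay)
    discount /= 1.0 + r / 12.0                    # discount along each rate path
    value += (interest + principal) * discount
    balance -= principal
    # mean-reverting lognormal step for next month's short rate
    log_r += kappa * (np.log(theta) - log_r) + sigma * rng.standard_normal(n_paths) / np.sqrt(12.0)

print(f"Monte Carlo fair value per $1 of pool balance: {value.mean():.4f}")
```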

Generating the interest rate paths is compute-intensive. The two popular methods are based on Monte Carlo simulation of a diffusion process and on sampling from a binomial lattice model of interest rates. Assuming a lognormal distribution with "mean reversion" modifications, Monte Carlo simulation generates a large number of interest rate paths. Applying the binomial lattice model, a large number of interest rate scenarios are computed along sampling paths from the binomial lattice. In this model, discrete points in time are marked on the horizontal axis, and nodes of the lattice represent possible interest rates at each point; from one period to the next, the state change can be either "up" or "down."


To understand the computational complexities, assume a binomial lattice of rates over a 30-year horizon in monthly intervals (i.e., 360 possible states) and a formidable 2^360 possible paths. Even using the best sampling methods, a sample of several hundred paths, typically around 1,000, has to be studied in order to obtain pricing estimates within acceptable error bounds.
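A quick sketch of the scale involved, with illustrative up/down multipliers: enumerating the lattice's paths is hopeless, so one draws on the order of a thousand random up/down paths and prices along those.

```python
# Sketch: sampling interest-rate paths from a 360-step binomial lattice.
import numpy as np

n_steps, n_samples = 360, 1000
u, d, r0 = 1.01, 0.99, 0.09              # illustrative up/down multipliers

print(f"total paths in the lattice: 2**{n_steps} ~ {2.0 ** n_steps:.2e}")

rng = np.random.default_rng(1)
moves = rng.integers(0, 2, size=(n_samples, n_steps))    # 1 = up, 0 = down
paths = r0 * np.where(moves == 1, u, d).cumprod(axis=1)  # rate along each path
print("mean terminal rate over the sample:", paths[:, -1].mean())
```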

A parallel implementation of the MBS simulation on a massively parallel Connection Machine was integrated with parallel algorithms for solving multi-period stochastic programming models to determine a "fair" price for the security or spread over a 30-year horizon. A computational experiment with the new hybrid approach, evaluating option-adjusted spreads and generating cashflow scenarios for a portfolio of 3,000 MBS on a Connection Machine CM-2 with up to 32,000 processors, showed that the largest scenario set of 2,048 was solved to a high tolerance of error in less than 3 minutes, while the same analysis would take days on a workstation and a few hours on a Cray supercomputer.

Since opportunities in the securities markets last only as long as the time it takes for the first person to take advantage of them, competition has placed a premium on speed of computation (and thus on getting to the market first [2]).

2.2. Electrical power planning

Multiperiod planning under uncertainty is also useful in the area of facility or resource planning. A typical application is the building of power stations and transmission lines. The problem may be posed as [22]: "Given a large multi-area electric power system, determine a mix of various types of generation and transmission capacities for the planning horizon that will meet future demand and reliability requirements and such that the total discounted capital and operating costs are minimized." The system may include interconnected utilities in single and multiple areas and a planning horizon of 30 to 40 years to account for the impact of installation of new technologies.

The multi-periodicity and uncertainty involved in the analysis of such a system give rise to a large-scale, complex, multiperiod stochastic linear programming model consisting of thousands of decision variables, constraints, and an unmanageable number of possible scenarios: a computationally intractable problem. Since the 1950s, when early works on planning under uncertainty were published, solution methods have been studied. Recent advances have opened up new avenues. In [26], a collection of papers on improvements in solution methodologies is given. Also, for the most recent advances in implementation of the solution algorithms on supercomputers, see [24,25,27,28,31,32]. The most recent solution approach couples a decomposition method with Monte Carlo importance sampling, assigning the sampling task to a parallel computer.

A multi-period stochastic linear programming problem presents a staircase structure for planning problems over time; in this structure, activities initiated in a given period have input and output coefficients in that period and the next (see the sketch below). So, for a large-scale real system, when the number of periods in a planning horizon increases and/or there is uncertainty, the problem size becomes large and intractable. To manage problem size, a specific decomposition approach [11] is applied so that the original linear program is decomposed into a set of smaller ones, one for each time period. Thus a master problem and a set of subproblems are produced. Then, to incorporate the uncertainty in terms of supply and demand at each period, likelihood of failure of subsystems, or introduction of new facilities in the electrical network, different scenarios are considered for each period and its associated subproblem. To represent each period under a given scenario, a sub-subproblem is generated. The solution proceeds toward an optimal solution by iteratively solving master-, sub-, and sub-subproblems. The solution to the master problem generates requests to the subproblems and their related sub-subproblems, and so on.
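For concreteness, a minimal LaTeX sketch of the staircase form referred to above; the matrices A_t, B_t and vectors b_t, c_t are generic placeholders, not the paper's notation:

```latex
% Staircase structure of a T-period planning LP: activities x_t of period t
% appear in that period's constraints and couple only to period t+1.
\begin{align*}
\min_{x_1,\dots,x_T}\ & \sum_{t=1}^{T} c_t^{\top} x_t \\
\text{s.t.}\ & A_1 x_1 = b_1, \\
             & B_{t-1}\, x_{t-1} + A_t x_t = b_t, \qquad t = 2,\dots,T, \\
             & x_t \ge 0, \qquad t = 1,\dots,T .
\end{align*}
```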

It is not difficult to imagine that for large-scale systems a closed-form solution is intractable and sampling techniques must be applied. In the context of an electrical power system, this means that "we sample contingencies in which multiple failures occur at a higher frequency than they occur naturally. Multiple failures lead to loss of significant system capacity, which in turn forces the solution to hedge properly to avoid such failures [22]."

Assume that the planning horizon includes two periods and that the problem is solved on a parallel supercomputer with many processors, the master problem assigned to a single, dedicated processor. The optimal solution to the master problem then provides input to the second-period subproblems, each of which is assigned to another processor, and so on. The aggregated solutions of all sub-subproblems generate new input to the master problem, which is again solved by its dedicated processor. This iterative process continues until an optimal or near-optimal solution to the overall problem is obtained.
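A minimal sketch of this parallel pattern under stated assumptions: solve_subproblem below is a hypothetical stand-in for one scenario's LP (here just a penalty on unmet demand), the scenarios and costs are illustrative, and the master's role is reduced to trying a few capacities. The point is only the structure: independent scenario subproblems are farmed out to separate processors and their aggregate result feeds back to the master.

```python
# Sketch of the master/subproblem iteration on parallel processors.
from concurrent.futures import ProcessPoolExecutor

def solve_subproblem(args):
    """Hypothetical stand-in for one scenario's LP: cost of unmet demand."""
    capacity, scenario_demand = args
    shortfall = max(scenario_demand - capacity, 0.0)
    return 100.0 * shortfall             # penalty reported back to the master

def expected_scenario_cost(capacity, scenarios):
    # one subproblem per processor, solved independently
    with ProcessPoolExecutor() as pool:
        costs = list(pool.map(solve_subproblem,
                              ((capacity, s) for s in scenarios)))
    return sum(costs) / len(costs)       # aggregated feedback to the master

if __name__ == "__main__":
    scenarios = [80.0, 100.0, 130.0, 160.0]   # illustrative demand scenarios
    for capacity in (90.0, 110.0, 130.0):     # crude stand-in for the master's search
        cost = 10.0 * capacity + expected_scenario_cost(capacity, scenarios)
        print(f"capacity {capacity}: expected total cost {cost:.1f}")
```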

The electrical power planning problem can thus be solved efficiently on supercomputers. Some preliminary results indicate that a very accurate solution can be obtained for a toy problem consisting of three generators, three transmission links, and two time periods. Research on electrical power planning is continuing.

3. The corporate survey

3.1. Objectives

Given the forecast that supercomputing will (1) have a significant impact on the data processing operations of corporations, and (2) be a competitive necessity in the 1990s, this study attempts to measure the degree of awareness, ability to use, current adoption, and plans of corporate America with respect to the new technology. The following questions are addressed: Are corporations using supercomputers? Is a corporate culture in place to support their use? What are the short-term projections concerning the application of supercomputing technology?

As far as we can determine, there are no published answers to these important questions. This paper gives the results of a 1989 study of 201 American corporations on their current and expected use of supercomputers for mainstream business applications. Specifically, this paper addresses the current corporate overall awareness of the "commercial" potential of supercomputing, the ability to "work with" supercomputers, the current utilization of supercomputing technology, and perceived usage in the industries.

3.2. Methodology

Table 1
A list of the 18 industries represented within the survey.

1. Aerospace
2. Banking
3. Chemicals
4. Consumer Products
5. Diversified Services
6. Equipment and Materials Manufacturing
7. Financial Services
8. Insurance
9. Industrial and Automotive Manufacturing
10. Petroleum and Mining
11. Health & Drugs
12. Food, Beverage, and/or Tobacco
13. Leisure/Entertainment
14. Retailing
15. Transportation
16. Electronics
17. Communications
18. Utilities

Note: The number by each industry in this table is used to represent that industry in Table 2 and Figures 1 and 4.

In September 1989, a pilot-tested survey was administered to the top 795 firms in the 1988 Forbes listing [34]. The survey had two sections: demographic aspects and corporate supercomputing.

The survey was mailed to the chief executive officer. A cover letter asked that the attached survey be forwarded to the chief information officer or director of information systems.

Twenty-three of the 795 firms wrote or phoned stating that company policy prohibited them from participating in any surveys. Two hundred and four corporations returned completed surveys; two hundred and one of these were usable. This represents a response rate of 25%.

Eighteen different industry types were represented within this study; see Table 1 for a list. The 1988 average market value of all respondents, as reported by Forbes, was $9.5 million. The number of employees ranged from 200 to 32,500. Sixty-six percent of the firms have a data processing staff of 200 or more. Table 2 illustrates the financial profiles of the responding corporations.

3.3. The survey results

In the following sections, we present the results and discuss their organization-wide and industry-wide implications.


Table 2
Financial profiles of responding corporations (as reported in Forbes, April 25, 1988).

Industry  Freq.  Avg. assets  Avg. sales  Avg. market  Avg. net     Avg. cash  Avg. employees
                 ($B)         ($B)        value ($B)   profit ($M)  flow ($M)  (thousands)
 1          3       5.3          8.2          2.6          372         713          85
 2         46       7.6          1.7          0.7          -47          -8           9
 3         10       7.1          6.7          5.8          518         988          35
 4          8       2.5          3.1          1.9          164         289          20
 5          6       5.1          2.8          2.3          186         346          15
 6          4       2.4          2.5          1.7          115         211          23
 7          5      32.9          4.5          1.5          186         241          11
 8          6      23.7          8.0          2.6          308         361          19
 9         10       4.1          4.8          2.3          211         422          31
10         15      11.5         11.0          5.0           11         635          29
11          7       3.2          3.8          3.4          208         303          27
12          9       3.1          4.1          3.8          253         384          45
13          6       1.7          1.4          2.2          159         273          18
14          7       4.1          7.4         30.8        1,110         375          86
15         10       5.2          4.0          1.8          147         417          32
16          8       3.4          3.8          3.2          254         422          43
17         12       6.4          3.6          4.3          321         824          25
18         28       7.2          2.5          2.2          265         516          11

Total     201

3.3.1. Supercomputing awareness

In [67], a six-stage model describes the successive stages in the evolution and maturation of information technology in an organizational context. New information technology is often adopted and tested by one department or segment of an organization. If this technology demonstrates favorable impacts on performance, it is then migrated to the MIS department for enterprise-wide use.

In a number of large companies, supercomputers have been used in research and development departments to process scientific/engineering applications. This raises the question: how aware are American corporations of the "commercial" potential of supercomputers?

Fig. 1. Supercomputing business application awareness in the sampled corporations, by industry.


Fig. 2. Supercomputing business application awareness by functional area (rated on a scale where 1 = very little awareness and 2 = some awareness).

By awareness, we mean how much the respondent knows about the present "commercial" supercomputing and/or the potential "commercial" applications in their firms.

Figure 1 provides an average of the awareness levels broken down by industry. The survey data indicate that, regardless of industry type, there is very little awareness. Figure 2 shows the awareness levels of nine functional areas, regardless of industry type. The information systems and the research and development functional areas scored highest.

The results are not surprising. Unlike supercomputing in science and engineering, commercial supercomputing is relatively new.

Fig. 3. Ability of IS staff to develop business applications on supercomputers (systems analysts, system programmers, application programmers, data base personnel, and end users).


Also, for competitive purposes, a limited number of companies conduct confidential research with supercomputers; apart from marketing products, there is little attempt to discuss research results publicly. The lack of end-user awareness must be viewed as an alarming sign in terms of corporate global competitiveness.

3.3.2. Ability to work with supercomputers

Although the overall awareness of corporate America about the business potential of supercomputing is low, the IS departments show a satisfactory degree of knowledge of it. The question then becomes whether or not IS groups can develop new and/or modify existing commercial applications to run within a supercomputer environment.

Most of the 201 IS directors thought that their systems development personnel would be able to develop commercial applications for a supercomputer. The directors indicated that end users would also be able to develop supercomputer applications. Figure 3 illustrates these assumptions.

IS groups have taken the leadership, as expected, in developing a basic ability to utilize this technology. A favorable environment exists, and IS directors must now find the corporate application areas that were once written off as impossible for technical reasons.


One major concern is whether supercomputers will allow organizations around the world to survive global competition in this decade and beyond. A multi-pronged effort should be undertaken to establish innovative and cross-disciplinary academic research and educational programs, strong national laboratory-academic-industry relationships, and a well documented and growing base of experience in commercial supercomputing applications. These efforts may be organized and coordinated at national and/or international levels.

To some degree such efforts have been organized by supercomputer vendors. Both IBM and Cray offer free-of-charge seminars to train IS people in the use of supercomputers and to give them ways to migrate their applications. The National Science Foundation and its associated national laboratories (e.g. the University of Illinois Supercomputing Center, the NPAC Center at Syracuse University, the Oak Ridge Joint Institute for Computational Sciences, the Pittsburgh Supercomputing Center, the San Diego Supercomputing Center, etc.) have established supercomputing facilities for researchers and educators, as have some of the vendors. Joint ventures have also been established between vendors and universities, e.g. the IBM Optimization Center in association with the Georgia Institute of Technology, Digital and Thinking Machines Corporation with the University of Pennsylvania, and Sequent with Southern Methodist University.

Fig. 4. IS directors' estimates of current and future supercomputing usage, by industry (percentage of competitors; current vs. future).


3.3.3. Present supercomputer utilization

Have IS directors taken the initiative in discussing the capabilities and limitations of supercomputers with corporate leadership? Are corporations ready to adopt supercomputing?

When asked to describe their current supercomputer status, 71% of the IS directors indicated that their firms do not have one, nor have they ever formally evaluated their worth. Eight percent revealed that their firms had considered but rejected the idea of purchasing a supercomputer. Six percent stated that they are currently considering leasing or purchasing one. Four percent have leased supercomputer time from an external source. The remaining 11% stated that they are currently leasing or own a supercomputer.

The fact that almost three quarters of the corporations surveyed have not investigated this technology is discouraging. When asked why, the most popular responses were: "Supercomputers are not cost-effective," and "No current applications require their processing power."

3.3.4. Projected supercomputer use

When asked to estimate the percentage of their competitors (corporations in the same industry) who are currently using a supercomputer, IS directors' answers averaged 20%. When asked to estimate what percentage of their competitors are considering using a supercomputer, they estimated that another 15% would do so in the next one to three years. Figure 4 depicts these short-term supercomputer projections.

This forecast is encouraging. As more corporations investigate and utilize supercomputers, the motivation to remove major implementation barriers will intensify.

3.3.5. Supercomputing applications

The firms which are currently utilizing or planning to use the technology listed their applications; Table 3 presents them. For proprietary reasons, IS directors were not specific.

Table 3
Scientific/engineering and business supercomputing in the sampled corporations.

Science and engineering applications: Burn Analysis; CAD; CAE; Chemical Process; Cooling Analysis; Computational Chemistry; Design Optimization; Electrical Circuit Simulation; Environment Modeling; Finite Element Analysis; Fluid Dynamics; Gas Management; Hardware Modeling; Image Processing; Nuclear Simulation; Mechanical Analysis; Molecular Modeling; Organic Compound Simulation; Physical Modeling; Reservoir Modeling; Seismic Processing; Structural Analysis.

Business applications: Aircraft Routing; Database Management; Demographic Modeling; Distribution; Econometric Simulation; Expert Systems; Financial Modeling; Financial Reporting; Forecasting; Load Flow Analysis; Marketing Research; Reservation Systems; Resource Allocation; Simulation of Business Operations; Text Retrieval; Training; Transaction Processing; Underwriting Analysis; Yield Management.

4. Summary and conclusions

The landmark MIT study, Management in the 1990s, identified supercomputers and novel architectures as one of the four key information technology factors that will impact businesses and governments worldwide in the future. Most IT will soon be a competitive necessity rather than a competitive advantage; thus, supercomputers potentially can introduce a major qualitative change by being able to solve some of the most difficult decision-making problems in real time with high accuracy.

A variety of commercial applications have been successfully implemented on supercomputers, and though the qualitative improvements have been impressive, commercial supercomputing has been limited to a few companies within a few industries.

Based on our survey of U.S. companies, we make the following observations. There is wide agreement across a number of industry types as to the definition of supercomputing, but business awareness of it is low or nonexistent throughout all functional levels and areas. In most corporations, IS directors believe that their systems development staff will be able to work with supercomputers. A majority, however, have never formally considered purchasing a supercomputer, though over 20% of the Fortune 500 firms are either considering purchasing a supercomputer or are currently using one. Corporate IS directors also expect supercomputing within their industry to increase in the next three years.

Based on our survey, the discussions of how to remove implementation and usage barriers, and the efforts of supercomputer vendors, we conclude that supercomputers should currently be viewed as providing "competitive advantage." However, as the demands of competing in a global marketplace intensify, supercomputers will become "competitive necessities." The time for corporations to recognize this dramatic change has come, and the time to begin adapting corporate IT culture is at hand.

Acknowledgements

The authors wish to thank those corporations who took the time and care to complete the questionnaire. Without their efforts this survey would have been impossible. Thanks also to our graduate students Hsin-Chung Chen, Adam Huarng, and Ravindra Krovi for their statistical and data handling assistance. Also, we would like to acknowledge the helpful comments made by the editor and four anonymous referees that significantly improved the content and presentation of this paper.

References

[1] Adams, D. Parallel Processing Implications for Management Scientists. Interfaces, 20, 3, 1990, 88-98.
[2] Akkus, D., Audley, D., and Carlson, E. Parallel Processing in the Capital Markets: Interactive Supercomputing. Proceedings of the Supercomputing USA/Pacific 91 Conference, Santa Clara, CA, June 1991, 112-113.
[3] Allen, W. Visualizing L.A. Smog. Supercomputing Review, February 1991, 26-27.
[4] Angeleri, P., Lozupone, D., Piccolo, F., and Clinckemaillie, J. PAM-CRASH on the IBM 3090/VF: An Integrated Environment for Crash Analysis. IBM Systems Journal, 27, 4, 1988, 541-560.
[5] Appel, M., and Hellerman, E. Census Bureau Experiments with Automated Industry and Occupation Coding. Proceedings of the American Statistical Association, 1983, 32-40.
[6] Bartlett, W. Mortgage-Backed Securities. New York Institute of Finance Corp., 1989.
[7] Beale, E. On Minimizing a Convex Function Subject to Linear Inequalities. Journal of the Royal Statistical Society, 17B, 1955, 173-184.
[8] Beasley, J.E. Supercomputers and OR. Journal of the Operational Research Society, 38, 11, 1987, 1085-1089.
[9] Bell, G. The Human Genome Project. Proceedings of the Supercomputing USA/Pacific 91 Conference, Santa Clara, CA, June 1991, 19-21.
[10] Bell, G. The Future of High Performance Computers in Science and Engineering. Communications of the ACM, 32, 9, September 1989, 1091-1101.
[11] Benders, J. Partitioning Procedures for Solving Mixed-Variable Programming Problems. Numerische Mathematik, 4, 1962, 238-252.
[12] Blueston, M. These Shop Floor Computers Can Think Superfast. Business Week, July 27, 1987, 68.
[13] Brooks, B. Method and Applications of Molecular Dynamics for Problems in Structural Biology on Parallel and Pipelined Machines. Proceedings of the Supercomputing USA/Pacific 91 Conference, Santa Clara, CA, June 1991, 45.
[14] Brown, P., and Picheny, M. Toward Automatic Dictation. Supercomputing at IBM Research, 1987, 28-29.
[15] Buzbee, B., and Sharp, D. Perspective on Supercomputing. Science, February 1985, 591-597.
[16] Carolan, W., Hill, J., Kennington, J., Niemi, S., and Wichmann, S. An Empirical Evaluation of the KORBX Algorithms for Military Airlift Applications. Operations Research, 38, 2, 1990, 240-248.
[17] Chevrin, R. Global Change: Climate Modeling. Proceedings of the Supercomputing USA/Pacific 91 Conference, Santa Clara, CA, June 1991, 46.
[18] Cohen, K., Maier, S., and Vander Weide, J. Recent Developments in Management Science in Banking. Management Science, 27, 10, 1981, 1097.
[19] Creecy, R., Masand, B., Smith, S., and Waltz, D. Trading MIPS and Memory for Knowledge Engineering: Automatic Classification of Census Returns on a Massively Parallel Supercomputer. Thinking Machines Corp., MA, 1991.
[20] Crowder, H. Supercomputing in the Office: A New Tool for OR/MS Decision Makers. Proceedings of the National DSI Conference, San Diego, November 19-21, 1990, 794.
[21] Crowder, H.P. Supercomputers in the Office. Business Perspectives, Spring 1990, 1-5.
[22] Dantzig, G. Parallel Processors for Planning Under Uncertainty. Annals of Operations Research, 22, 1990, 1-21.
[23] Dantzig, G. Linear Programming Under Uncertainty. Management Science, 1, 1955, 197-206.
[24] Dantzig, G. Decomposition Techniques for Large-Scale Electric Power Systems Planning Under Uncertainty. In Impacts of Recent Computer Advances on Operations Research, North-Holland, 1989, 3-20.
[25] Dantzig, G. Planning Under Uncertainty Using Parallel Computing. Annals of Operations Research, 14, 1988, 16.
[26] Dantzig, G., Dempster, M., and Kallio, M., eds. Large-Scale Linear Programming, Vols. 1-2. IIASA Collaborative Proceedings Series, CP-81-51, IIASA, Laxenburg, Austria, 1981.
[27] Dantzig, G., Pereira, M., et al. Mathematical Decomposition Techniques for Power System Expansion Planning. EPRI EL-5299, Vols. 1-5, Electric Power Research Institute, Palo Alto, CA, 1988.
[28] Dantzig, G., Glynn, P., Avriel, M., Stone, J., Entriken, R., and Nakayama, M. Decomposition Techniques for Multi-Area Transmission Planning Under Uncertainty. Final Report of EPRI Project RP 2940-1, Systems Optimization Laboratory, Department of Operations Research, Stanford University, CA, 1988.
[29] Design News. Supercomputing System Aids Shuttle Effort. September 1987, 34.
[30] Dongarra, J.J., and Duff, I.S. Advanced Architecture Computers. Technical Memorandum 57 (Revision 1), Mathematics and Computer Science Division, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, IL 60439, January 19, 1987.
[31] Entriken, R. The Parallel Decomposition of Linear Programs. Ph.D. Dissertation, Department of Operations Research, Stanford University, CA, 1989.
[32] Entriken, R. A Parallel Decomposition Algorithm for Staircase Linear Programs. Oak Ridge National Laboratory Report ORNL/TM-11011, 1988.
[33] Erisman, A.M., and Simon, H.D. From Slide Rules to Supercomputers. Datamation.
[34] The Forbes 500. Forbes, April 25, 1988, 136-350.
[35] Freundlich, N. An Electronic Referee for Squabbling Economists. Business Week, October 16, 1989, 72.
[36] Fukumoto, A. First-Principles Calculations of Elastic Properties of Semiconductors. Proceedings of the Supercomputing Japan 91 Conference, Tokyo, April 1991, 176-179.
[37] Fukushima, K., Miyake, S., and Ito, T. Neocognitron: A Neural Network Model for a Mechanism of Visual Pattern Recognition. IEEE Transactions on Systems, Man and Cybernetics, SMC-13, 5, 1983, 826-832.
[38] Gasser, L. Large-Scale Concurrent Computing in Artificial Intelligence Research. Proceedings of the Third Conference on Hypercube Concurrent Computers and Applications, 1988, 1342-1351.
[39] Gehringer, E.F., Abullarade, J., and Gulyn, M.H. A Survey of Commercial Parallel Processors. Technical Report, Computer Systems Laboratory, North Carolina State University, Raleigh, NC 27695-7911, 1988.
[40] Guadagni, P., and Little, J. A Logit Model of Brand Choice Calibrated on Scanner Data. Marketing Science, 2, 3, 1983, 203-238.
[41] Gullo, K., and Schatz, W. The Supercomputer Breaks Through. Datamation, May 1, 1988, 50-63.
[42] Hart, R. Supercomputing and the Healing Arts: Finite Element Modeling of Strain-Induced Bone Remodeling. Projects in Scientific Computing, Pittsburgh Supercomputing Center, 1988, 35-37.
[43] Himeno, R. Numerical Analysis and Visualization of Flow in Automobile Aerodynamics Development. Proceedings of the Supercomputing USA/Pacific 91 Conference, Santa Clara, CA, June 1991, 62-66.
[44] Hutchinson, J., and Zenios, S. Financial Simulations on a Massively Parallel Connection Machine. To appear in International Journal of Supercomputer Applications, The MIT Press.
[45] Hwang, K. Exploiting Parallelism in Multiprocessors and Multicomputers. In Parallel Processing for Supercomputers and Artificial Intelligence, McGraw-Hill, 1989, 31-68.
[46] Hwang, K., and Degroot, D. Parallel Processing for Supercomputers and Artificial Intelligence. McGraw-Hill, 1989.
[47] Hwang, K., and Briggs, F.A. Computer Architecture and Parallel Processing. McGraw-Hill, 1984.
[48] Iversen, W. Detroit Revs Up High-Performance Engines in a Race Against Japanese Supercomputer Horsepower. Supercomputing Review, November 1989, 21-25.
[49] Kamel, A., Kindelan, M., and Squazerro, P. Seismic Computations on the IBM 3090 Vector Multiprocessor. IBM Systems Journal, 27, 4, 1988, 510-527.
[50] Karmarkar, N. A New Polynomial Time Algorithm for Linear Programming. Combinatorica, 4, 1984, 373-395.
[51] Kelleher, J. Companies Can't Live on Technology Alone. Computerworld, June 25, 1990, 67.
[52] Kulkosky, V. Trading with Turbo Power Changes Rules of the Game. Wall Street Computer Review, October 1990.
[53] Kusy, M., and Ziemba, W. A Bank Asset and Liability Management Model. Operations Research, 34, 3, 1986, 356.
[54] Lanning, E. The Role of High Speed Computing in the Petroleum Industry. Proceedings of the Supercomputing USA/Pacific 91 Conference, Santa Clara, CA, June 1991, 31-33.
[55] Lapedes, A., Bryngelson, J., Farber, R., Stolorz, P., Wolf, D., and Xia, Y. Prediction of Protein Structure Using Neural Nets and Information Theory. Proceedings of the Supercomputing USA/Pacific 91 Conference, Santa Clara, CA, June 1991, 30.
[56] Lerner, E. Parallel Processing Gets Down to Business. High Technology, July 1985, 20-28.
[57] Levesque, J. Reservoir Simulation on Supercomputers. Society of Petroleum Engineers, April 1985, 275-279.
[58] Madnick, S. The Information Technology Platform. In The Corporation of the 1990s: Information Technology and Organizational Transformation, M.S. Morton, ed., Oxford University Press, 1991, 27-60.
[59] Manuel, T. Supercomputers: The Proliferation Begins. Electronics, March 3, 1988, 51-54.
[60] Marino, C., Clifford, G., Long, M., and Misegades, K. The Supercomputer and the Automotive Industry. Automotive Engineering, November 1989, 58-62.
[61] Morton, M.S. The Corporation of the 1990s: Information Technology and Organizational Transformation. Oxford University Press, 1991, 3-23.
[62] Mulvey, J. Nonlinear Network Models in Finance. In Advances in Mathematical Programming and Financial Planning, JAI Press, 1987, 253.
[63] Mulvey, J., and Vladimirou, H. Stochastic Optimization Models for Investment Planning. Annals of Operations Research, 20, 1989, 187-217.
[64] Maisel, S., ed. Risk and Capital Adequacy in Commercial Banks. The University of Chicago Press, Chicago, IL, 1981.
[65] Nagashima, U. Supercomputer and Computational Chemistry in Japan. Proceedings of the Supercomputing Japan 91 Conference, Tokyo, April 1991, 164-169.
[66] Neubarth, M. Supercomputer Firms Eye General Business Market. MIS Week, 10, 4, January 1989, 10.
[67] Nolan, R.L. Managing the Crises in Data Processing. Harvard Business Review, March-April 1979.
[68] Pastore, R. Multiprocessors Still Wait for Software. Computerworld, July 2, 1990, 33.
[69] Pavan, S. Just Off the Assembly Line. Wall Street Computer Journal, November 1989, 50.
[70] Penczer, P. Supercomputer Era Dawns on Wall Street. Wall Street Computer Review, November 1989, 41-74.
[71] Perold, A. Large Scale Portfolio Optimization. Management Science, 30, 10, 1984, 1143.
[72] Petersen, C. Computer Simulation of Large-Scale Econometric Models: Project Link. The International Journal of Supercomputer Applications, 1, 4, 1987, 31-53.
[73] Quinn, M.J. Designing Efficient Algorithms for Parallel Computers. McGraw-Hill, 1987, 2-4.
[74] Reed, K. Super Power: Never Enough. Computerworld, December 22, 1986, 41.
[75] Riganati, J., and Schneck, P. Supercomputing. Computer, October 1984, 97-113.
[76] Rossi, A., and Silverman, D. Calculating Polymer Structures. Supercomputing at IBM Research, 1987, 8-9.
[77] Smarte, G., and Penney, W. Sounds and Images. BYTE, December 1989, 243-256.
[78] Staubbs, G. Industry Needs Design-Automation Experts to Unleash the Power of Supercomputers. EDN, August 7, 1986, 259-262.
[79] Sumi, A. Prediction of Climate System by Using Numerical Models. Proceedings of the Supercomputing Japan 91 Conference, Tokyo, April 1991, 193-194.
[80] Svarney, P. Supers Are Making a Big Splash, Turning the Tides in Ocean Research. Supercomputing Review, February 1991, 43-44.
[81] Svarney, P. Planetary Snapshots: A Matter of Image. Supercomputing Review, November 1989, 30-38.
[82] Szego, G. Portfolio Theory: With Application to Bank Asset Management. Academic Press, New York, 1980.
[83] Tinoco, E. The Impact of Supercomputing on Aircraft Design at Boeing. Proceedings of the Supercomputing USA/Pacific 91 Conference, Santa Clara, CA, June 1991, 74-80.
[84] Traub, R. Probing the Brain with a Computer. Supercomputing at IBM Research, 1987, 10-11.
[85] Venkatraman, N. IT-Induced Business Reconfiguration. In The Corporation of the 1990s: Information Technology and Organizational Transformation, M.S. Morton, ed., Oxford University Press, 1991, 122-158.
[86] Wallach, S. A Supercomputer is a Supercomputer, Sometimes. Supercomputing Review, November 1989, 59.
[87] Waltz, D. Intelligent Database Applications of Massively Parallel Supercomputers for the 1990's. Proceedings of the Supercomputing Japan 91 Conference, Tokyo, April 1991, 13-18.
[88] Wilke, J. Parallel Processing Computers Attract Crowd of Investors Despite Limited Uses. The Wall Street Journal, October 5, 1990.
[89] Zenios, S. Parallel Running. Risk, 3, 10, November 1990, 29-31.
[90] Zenios, S. A communicated note, December 20, 1990.
[91] Zenios, S. Parallel Numerical Optimization: Current Status and an Annotated Bibliography. ORSA Journal on Computing, 1, 1, Winter 1989, 20-43.
[92] Zenios, S. Massively Parallel Computations for Financial Planning Under Uncertainty. To appear in Very Large Scale Computations in the 21st Century, SIAM, Philadelphia, 1991.
[93] Zenios, S. Parallel Monte Carlo Simulation of Mortgage-Backed Securities. To appear in Financial Optimization, Cambridge University Press.
[94] Ziemba, W., and Vickson, R., eds. Stochastic Optimization Models in Finance. Academic Press, New York, 1975.