


Social Aspects of Computing

Rob Kling Editor

Computing, Research, and War: If Knowledge is Power, Where is Responsibility?

Jack Beusmans and Karen Wieckert

In the United States, artificial intelligence (AI) research is mainly a story about military support for the development of promising technologies. Since the late 1950s and early 1960s, AI research has received most of its support from the military research establishment [37, 55].1 Not until the 1980s, however, has the military connected this research to specific objectives and products. In 1983, the $600-million Strategic Computing Program (SCP) created three applications for "'pulling' the technology-generation process by creating carefully selected technology interactions with challenging military applications" [16]. These applications, an autonomous land vehicle, a pilot's associate, and a battle management system, explicitly connect the three armed services to further AI developments [29, 51, 53]. The Defense Science Board Task Force on the "Military Applications of New-Generation Computer Technologies" recommended warfare simulation, electronic warfare, ballistic missile defense, and logistics management as also promising a high military payoff [18].

In his 1983 "Star Wars" speech, President Reagan enjoined "the scientific community, . . . those who gave us nuclear weapons, . . . to give us the means of rendering these nuclear weapons impotent and obsolete" [43]. As in the Manhattan and hydrogen bomb projects, AI researchers and more generally computer scientists are expected to play major parts in this quest for a defensive shield against ballistic missiles. Computing specialists such as John von Neumann played a supportive role by setting up the computations necessary for these engineering feats, with human "computers" for the atom bomb [10]2 and with ENIAC and other early computers for the hydrogen bomb [9]. The "Star Wars" project challenges computer scientists to design an intelligent system that finds and destroys targets, basically in real time and without human intervention.

The interdependence of the military and computer science rarely surfaces during our education as computer practitioners, researchers, and teachers. Where might information concerning these important military applications enter into computer science and AI education? Where do students receive information concerning the important role they may play in weapon systems development? One of our students recently remarked that "as a computer science major, I did not realize the magnitude of the ramifications of advancing technology for the military . . . In a field so dominated by the DOD, I will have to think seriously about what I am willing and not willing to do, and what lies in between those two poles."3

As researchers and educators, the authors wish to encourage colleagues and students to reflect upon present and historical interactions between computer science as an academic discipline and profession, and military projects and funding. As computer professionals, we lay claim to specialized knowledge and employ that knowledge in society as developers of computing technologies. Thus, we exercise power. Recognizing that as professionals we wield power, we must also recognize that we have responsibilities to society. To act responsibly does not mean that computer professionals should advocate a complete separation between computer science and military missions. However, we should openly examine the inter-relationships between the military and the discipline and practice of computing. To act responsibly does not mean that computer scientists and

1 In the early 1980s, before SCP began in 1984, DOD funded about 70 percent of AI-related R&D ($21.6 million). The remaining 30 percent was funded by NSF (about $5.5 million) and NIH (about $3.9 million). In 1985, DOD funded about 50 percent of basic and applied research in academic computer science.
2 Physicist Richard Feynman remembers the calculations: "We set up the room with girls in it. She was the multiplier, and she was the adder, and this one cubed, and we had index cards, and all she did was cube this number and send it to the next one."

© 1989 ACM 0001-0782/89/0800-0939 $1.50
3 From student course evaluations of "Computers and Military Technology," Winter 1987, ICS Department, University of California, Irvine.



practitioners should eschew support or employment from the military, although some are justified in taking such a stance.4 To act responsibly requires attention to the social and political context in which one is embedded; it requires reflection upon individual and professional practice; it requires open debate. The lack of attention to issues of responsibility in the typical computer science curriculum strikes us as a grave professional omission. With this article, we hope to add material to the dialogue on appropriate computing applications and their limits. We also hope to provoke reflections on computing fundamentals and practice at the individual, professional, and disciplinary levels, as well as prodding government institutions, professional societies, and industry to support in-depth research on the issues we raise here.

Reflection requires information and discussion. Academic computer science departments rarely support serious consideration of even general issues under the rubric of the social and ethical implications of computing. Unlike any other U.S. computer science department, Information and Computer Science (ICS) at UC Irvine has an active research program in the social implications of computing (Computers, Organizations, Policy and Society, or CORPS). Even within CORPS, research that addresses the interactions between the military and computer science is difficult to pursue, not because individuals aren't interested, but because they are not able to find professional or academic support. The authors' interests in these issues arose from personal concerns over the dependence of military systems upon complex technology, and the possible grave outcomes of this fragile relationship. CORPS provided a supportive intellectual environment that allowed us to pursue our interests. In 1987, we developed and taught an undergraduate course designed to inform students about military applications and their limits, and allow dialogue on professional responsibilities. In general, little monetary support is available for research that considers these issues, and it is only through support from the Institute on Global Conflict and Cooperation and campus instructional funds that we were able to develop and teach the course.

Few researchers or educators can devote time and/or energy to pursue the social and ethical implications of their work and profession, in addition to their "mainstream" research. Since the discipline of computer science does not consider these reflections serious "mainstream" research, those who choose to pursue these vital questions have difficulties finding employment and/or advancing through the academic ranks. Growing concern over these issues and interest by computer scientists, as evidenced by the group Computer Professionals for Social Responsibility [38], individuals such as David Parnas [39], and this article, may lead to future research support and academic recognition.

4 See for example, J. Weizenbaum, "Not Without Us," Computers and Society 16, 2-3 (Summer/Fall 1986), 2-7; and T. Winograd, "Strategic computing research . . ."

For now, as concerned professionals, we offer the following reviews. They pose many more questions than answers. This article does not provide the detailed investigations which are required as precursors to a rigorous analysis of computing use in these applications; we hope that our reviews generate discussion and debate. In the first section, we present the course rationale and content, as well as student responses. In the sections following the course description, we consider three applications (smart weapons, battle management, and war game simulations) that are generating research and development funds and that have controversial implications for military uses of computing. We start with smart weapons, that is, the development of weapons that can destroy targets with minimal human intervention. Next we look at battle management systems designed to coordinate and assess the use of resources and people in warfare. Finally, we turn to war gaming as a means for evaluating weapon performance and strategies for war fighting. In each case, we describe the state of the technology, its current and potential uses, and its implications for the conduct of war.

A COURSE ON COMPUTERS AND MILITARY TECHNOLOGY

Motivation
A university degree, either in computer science or a cognate discipline such as mathematics, serves as the certification process for computer practitioners. Individuals with such degrees are assumed to possess specialized and useful knowledge for developing industrial and military products. During their university years, few students learn about the position of their profession in society, or the individual, group and societal ramifications of the products they will help develop. More often than not, if computer professionals confront the implications of computing, they do so after completing their university education. This lack of emphasis on the implications of computing in the universities leaves computer practitioners ill-prepared to make personal or professional decisions not directly following from their technical training, and in particular to understand applications within the military sector. In our opinion, this is a fundamental oversight in the education of computer professionals. And since many of our students find employment within the Southern California defense industry, and overall large numbers of computer professionals work on military projects, we felt it appropriate that prospective computer professionals be exposed to the use of computers in the military.




Even without special courses, students could gain exposure to military applications through discussions in other courses or in textbooks. However, professors rarely analyze these applications in their courses and textbooks give at best superficial treatment to computing applications, including military ones. As an example, consider a student specializing in artificial intelligence. In his widely used AI textbook, Winston [66] states that one goal of AI is to make computers more useful in business, engineering, manufacturing, farming, mining, education, health, and households; he never mentions military applications. And Winston's textbook is by no means an exception. Military applications are absent from Nilsson's 16-page prologue on the applications of AI [36], as well as Rich's [45] and Charniak and McDermott's [14] textbooks. Although the encyclopedic Handbook of Artificial Intelligence mentions in passing that "the Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense [is] a major sponsor of AI research," the detailed discussion of applications focuses exclusively on science, medicine, and education [11]. To be fair, some of these books were written before the SCP and SDI were announced, yet it couldn't have escaped the authors' attention that their research was almost entirely funded by DOD.

This sanitization of AI applications is absent in some non-standard texts on AI. For instance, Zusne's [69] book on vision discusses military applications of visual perception and reasons for military interest, and Stevens [52] mentions some military applications of image understanding systems. But even these treatments are cursory; for example Schutzer [49], who mentions the goals of DARPA's SCP and a number of other ongoing DOD programs, summarizes future applications of knowledge-based systems by giving specific examples for every category but the military.

Because of this superficial treatment, students can earn their bachelor's and even doctoral degrees in computer science without ever learning of military applications, let alone their possible ramifications for the conduct of war, national and international economies, and professional careers. For this reason, we developed and taught an upper-division course on computers in the military. In ICS, students are required to take a course in Computerization and Society, and in 1987 our course Computers and Military Technology also fulfilled this requirement. (See syllabus and reading list.)

Content
Approximately 30 ICS students enrolled, half juniors and half seniors. Prior to the course, students had only vague notions about the military role of computers and AI and about the funding of academic research. The students were interested and eager to discuss these issues and inform themselves; in spite of the reading load, there was minimal moaning and groaning. The students' attitudes spanned the political spectrum; some considered the course too conservative, while others found it too liberal.

A number of students took the course out of deep political and personal convictions against military applications; some saw the course as a springboard into defense work; still others wanted to expose themselves to multiple views of the military. In one course evaluation, a student wrote that "I realize that before I enter the working world I must make a decision concerning whether or not I should work for a defense related company, and if so, to what extent I should work on potentially destructive projects . . . I realize it will require still more thought and information before I decide where I will draw the line." Another student eloquently argued that computer science departments should inform their graduates "instead of . . . belching into the work force 'computer techies' who are ignorant as to how they will be used by society."5

We presented the 10-week course in four sections. The first section was devoted to descriptions of specific military systems and computing components which enabled them. We presented the development of technologies such as intercontinental ballistic missiles and multiple independently targeted reentry vehicles, cruise missiles, and command, control, communications and intelligence (C3I) structures and organizations. We also invited a guest lecturer to discuss the history of war gaming and the present use of expert systems to model strategic decision-making.

The second section of the course was devoted to the infrastructure in which computing and the military are embedded. We included lectures on the general history of U.S. science and technology, monetary and organizational support for U.S. computing research and development, and a description of the military-industrial-academic complex. One lecture was devoted specifically to sources of research funding in ICS. This was one of the more popular lectures, since the students hadn't realized the extent to which our own department depends upon military funding (almost 50 percent of the funding was from the DOD in 1987).

In the third section, we focused on two cases which exemplified the military plans for computing technology and how these plans are carried out in the military-industrial-academic arena. We presented a short segment on the Strategic Computing Program, and a larger segment on SDI, including a videotape of a debate between computer scientists on the technical feasibility of the system [44] and a debate between four students in the class. This in-class debate was extremely instructive, not only for those directly involved, but also for the entire class. It taught valuable lessons on the difficulties of making sound arguments.

Finally, we examined the effects of technology development on military strategy, including the impact of anti-submarine warfare on deterrence and the momentum toward launch on warning fed by increasingly accurate missiles embedded within "real-time" command, control and communication structures. The final section also included a discussion of personal and professional decisions that other scientists and engineers, including computer scientists, have made both pro and con with respect to military funding, projects, and careers.

5 See Footnote 3.




The success of the course can be measured not only in the learning experience of the students, but also of the instructors. In the future, we hope to teach this course again. Prior to our course, the required undergraduate and graduate courses in Computerization and Society rarely included sections on military applications. Presently, a draft of this article is used within these courses, and every quarter, one of the authors lectures on military topics (e.g., SDI, C3I, and departmental military funding) in various ICS courses. We hope that instructors in other institutions can follow our example and can use this article and other materials in their own teaching.

INSTRUCTIONAL INSTRUMENT
The next three sections present overviews of military applications of computing and AI technology. Each description includes a brief general history of the application, historical and planned uses of computing technology within the application, and a short discussion of the implications of the technology. We have included a number of references for each application, which are by no means exhaustive nor completely up-to-date. Programs, plans, and players change frequently in this arena. However, the references point to excellent sources of material, and an instructor could use this article as a starting point for a lecture on the applications, a class on computing technology and the military generally, or possibly for beginning a research endeavor. The overview also provides a background for students, and may pique their interest to search more widely for information.

SMART WEAPONS
In warfare one needs "to shoot at real targets, not just real estate" [34]. Conducting war requires decisions that distinguish real targets from mere real estate, locate these real targets, and finally destroy them. Smart weapons employ AI, particularly in guidance systems, to encapsulate decisions about identifying, precisely locating, and destroying targets. In this section, we examine automated guidance from early cruise missiles to present plans for autonomous vehicles.

The U.S. Navy and Sperry Gyroscope Co. performed the first experiments on guided missiles during the last years of the First World War [23]. Their aerial torpedo or flying bomb, essentially a pilotless airplane, did not become operational before the end of the war and the program ended after the armistice. During the Second World War, cruise missiles (the V-1 or buzz bomb) and ballistic missiles (V-2) were first used on a large scale. These missiles relied on a primitive inertial navigation system6 to stay on course and were rather inaccurate, often missing targets by many kilometers.

6 Inertial navigation is based on Newton's second law, F = ma. Acceleration is calculated by measuring the force exerted on some test mass. Given the initial velocity and position, velocity and position can then be computed continuously and corrected if necessary.

Improvements in inertial navigation over the years now allow planners to expect warheads of ballistic missiles, such as the MX, to land within a small distance of a target 11,000 kilometers away 50 percent of the time. Unlike ballistic missiles, long range cruise missiles, with flights as long as five hours, can miss their targets by considerable distances because slight errors in inertial navigation build up over time [58, 57]. Thus, to increase accuracy, additional guidance techniques are being developed.

Today, cruise missiles use the TERCOM (Terrain Contour Matching) system, which measures the height of terrain a missile flies over and compares it with stored altitude maps of the area the missile is expected to pass on its way to a target. When deviations from the expected path are detected, the missile's course is adjusted. Since TERCOM works only over hilly terrain, it is supplemented by systems that compare stored terrain images with images taken en route. These images can be based on visible light, infrared, or radar. Related guidance techniques are used in other missiles such as the Exocet.
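The matching step at the heart of TERCOM can likewise be sketched: compare a short profile of measured altitudes against a stored map and take the best-fitting position. The fragment below is a deliberately simplified, hypothetical illustration; real systems work from radar altimetry and two-dimensional reference maps, and the altitude values here are invented.

```python
# Simplified sketch of the terrain-matching idea behind TERCOM: slide the
# measured altitude profile along a stored map row and keep the offset with
# the smallest mean squared difference. Real systems use radar altimetry,
# two-dimensional reference maps, and statistical tests; these numbers are
# invented for illustration.

def best_offset(stored_profile, measured_profile):
    """Return the offset in the stored map where the measured profile fits best."""
    n = len(measured_profile)
    best, best_err = 0, float("inf")
    for offset in range(len(stored_profile) - n + 1):
        err = sum((stored_profile[offset + i] - measured_profile[i]) ** 2
                  for i in range(n)) / n
        if err < best_err:
            best, best_err = offset, err
    return best

stored = [120, 135, 150, 180, 210, 190, 160, 140, 130, 125]  # map altitudes (m)
measured = [182, 208, 193, 158]                               # altimeter readings (m)
print("estimated position index:", best_offset(stored, measured))  # prints 3
```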

Although generally referred to as smart, these weapons are presently at best able to home in on the silhouette of a large object selected by a human operator or follow a precomputed path to a target [60, 61]. In other words, these weapons are far from being truly autonomous (in the sense of being able to follow generic commands like "destroy those [expletive] ships"), but developments point in that direction [25]:

Although artificial intelligence is beginning to evolve in theory as various universities and research laboratories delve into robotics, pattern recognition, interactive computing, and signal processing, it is still a long way from initial operational capability in terms of automatic target identification. The sophisticated algorithms now being evaluated require massive computational circuitry. . . . This means initially developing a seeker that can discern shipping from background landmass; then, the seeker would have to be improved so it can distinguish friendly shipping from hostile; and, finally, the seeker should evolve into one capable of discerning particular ship classes from one another . . . ongoing efforts in very high-speed integrated circuitry (VHSIC) design must come to fruition in order to embed the necessary recognition circuitry in the Tomahawk [cruise missile].

Indeed, many programs are directed toward the goal of automatic target identification and guidance; the SCP's autonomous land vehicle mentioned earlier is one example [15, 28, 35]. The Army Science Board recommended that the Army establish a $100-million, three-year machine vision research program for target identification [1]. Boeing Corporation is flight testing the Boeing Robotic Air Vehicle 3000, which can be used for reconnaissance, electronic warfare, and as a dispenser of other smart munitions [3, 5].



Related work concerns teleoperated robots or remotely piloted vehicles, that is, robots controlled by radio signals or a fiber optic cable [1-3, 19, 22, 24]. Thus, as one observer put it, Asimov's "three laws of robotics," the most important of which states that robots are forbidden to injure people, "will almost certainly be broken soon" [47].

What are some implications of having smart, perhaps autonomous, weapons available for use in warfare? First, to the extent that smart weapons are accurate, they can be expected to minimize collateral (i.e., unintended) damage. The expectation of removing only the bad guys with little damage to innocents can lead to early use of these weapons in conflict, or possibly even initiation of conflict. This tendency may be reinforced by the ability to distance one's own forces from harm, for instance attacking without risking one's own soldiers' lives [8].7 According to Gerstenzang, a Los Angeles Times reporter, observers of the April 15, 1986 raid on Libya, in which laser-guided bombs were used, suggested that "without smart bombs, the decision never would have been made. . . . Anything that gives you accuracy does provide a real temptation. It sanitizes the use of force . . . there is a perception that if you have a bomb that is smart enough, you can carry out missions without getting in harm's way. This is a fallacy" [21].8

Second, the qualification “autonomous” implies that a weapon bears responsibility for its own actions, that it is the computer’s fault [30]. This is, of course, nonsense. Responsibility for using an autonomous weapon rests not only with the commanding officer who orders the use of a weapon and the operator who decides to use it, but also with the weapon designers who encapsulate knowledge of real targets and how to locate them. As a thought experiment, consider an incident similar to the My Lai massacre during the Vietnam War, carried out not by humans who can be held directly responsible, but instead by runaway robots. We must actively guard against the possibility that blame can shift to robots that act in some unforeseen manner.

Lastly, increasing automation, such as that used for smart weapons, is but one manifestation of modern warfare that blurs the distinction between soldiers and innocents. In 1970, Senator Proxmire raised this potential for indiscriminate killing in a Senate presentation on the "electronic battlefield" then in use in Vietnam [20, 32], and to which we turn in the next section.

BATTLE MANAGEMENT
According to General Westmoreland, former U.S. Army Chief of Staff, "we are on the threshold of an entirely new battlefield concept," one in which "enemy forces will be located, tracked, and targeted almost instantaneously through the use of data-links, computer-assisted intelligence evaluation, and automated fire control" [20].9

7 According to Admiral Hostettler, a cruise missile "can permit a limited, measured response as an expression of U.S. will and determination without jeopardizing aircraft and pilots."

8 A laser-guided bomb can steer itself to a laser-illuminated target. Some bombs did indeed miss their targets by considerable distances during the Libya raid.

At that time (1969), the United States was operating the Igloo White program, which came close to making Westmoreland's future battlefield a reality [20, 32, 62].

Igloo White, designed to prevent the North Vietnamese from getting supplies to South Vietnam, was an electronic barrier erected across the Ho Chi Minh Trail in Laos. This barrier consisted of acoustic and seismic sensors that broadcast the sounds and vibrations caused by objects in their vicinity. Aircraft, including drones, patrolled the area and relayed signals from these sensors to the Infiltration Surveillance Center in Thailand where the information was analyzed. Locations of targets were sent to command and control aircraft which would direct strikes, or to assessment officers in the Thai surveillance center who could themselves direct aircraft to strike targets.

Igloo White brings to the forefront the same issues we raised in the context of smart weapons. It allowed assessment officers to use force without having to confront the consequences directly. It led to indiscriminate killing because seismic sensors cannot differentiate between soldiers and farmers. In addition, it created an artificial reality in which people were reduced to blips on a display screen, blips that must be stopped, rather like a video arcade game.10

Research and development of battlefield automation and management has continued. Currently, the Army is using computers that assist in positioning artillery, directing fire, analyzing intelligence data, and conducting electronic warfare. In 1985, the Army started the Army Command and Control System program to oversee the acquisition of computers and communications systems at all levels. The cost estimates reach nearly $20 billion over the next ten years [34, 59]. Expert systems are now being developed to aid or automate vaguely defined command and control tasks, in particular the "integration (fusion) of data" [1, 27, 40, 50].11 The SCP aims at developing battle management software for a carrier battle group, initially calling for a system to warn of "threats" and improved interfaces between man and machine. Recently, SCP also started work on implementing the Army's AirLand battle doctrine, currently planned for use in Europe and Korea. The implementation calls for integrating expert systems to plan maneuvers [15].

9 Excerpts from General Westmoreland's address to the Association of the U.S. Army, Oct. 14, 1969 (quoted in [20], p. 215-223).
10 Two assessment officers commented on the CBS Evening News in June 1971: "to break the boredom, to show everyone else that your equipment is working well, that you're doing a good job, you want to fire right away, and you're not interested in discriminating because you are not looking at what you're firing at. You're essentially going to fire, and then try to stop the [blip]. There's no checking as to who you're firing on because the curfew covers our conscience on that. We have the curfew which says nobody, no civilian is allowed out in the countryside at night, and if you pick up somebody moving at night, that's it. There's no checking." [32]

11 TRW is developing an expert system for analyzing intelligence data and arriving at the most likely conclusions in the presence of conflicting data [1]. Slagle and Hamburger [50] describe BATTLE, an expert system to "plan, integrate, direct, and coordinate the fires of supporting arms." In related work, expert systems are being developed for interpreting seismic and acoustic signals, the latter for locating submarines for anti-submarine warfare [40].



To test new battle management concepts, the Rome Air Development Center at Griffiss Air Force Base in New York established a Battle Management Center to provide a "simulated operational environment." Early in 1986, the Center awarded five-year research contracts to a number of New York universities to investigate (distributed) problem solving, speech and image understanding, and plan recognition [4, 42]. The Center is to serve as a test bed for battle management expert systems and is expected to be used by the Strategic Defense Initiative Office as well [58].

In addition to expert systems, human experts will also be developed at such facilities. As former Lieutenant General John Cushman points out [18]:

In modern air/land (or air/land/sea) warfare, there are no real experts because there has been no experience. . . . But through simulation we can have the experience of war, without the cost of war, and we can develop experts. When technical people can in a realistic and authentic battle simulation observe what successful people do to gain their success, when they can ask questions of those (now) recognized "experts," they can begin the arduous process of reducing the knowledge of these people to "rules."

Using battle simulations to turn people into experts in warfare, building expert systems to duplicate this expertise, and then perhaps using these expert systems in the next round of simulations is highly suspect. To date, successful expert systems have been developed in narrow, highly circumscribed domains for which there are recognizable experts. Even involving more than one expert in the process of engineering an expert system has led to difficulties, simply because individual experts' perspectives and motives differ. The Army may be no more successful in its efforts at developing useful expert systems, but in some cases at least its methods of development are better. For example, it plans to develop intelligence systems by starting with limited systems, followed by test and evaluation during actual use in the field [59].
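To make concrete what "reducing the knowledge of these people to 'rules'" means in software, the following toy fragment hard-codes a handful of if-then rules for classifying a radar contact. It is a hypothetical sketch, not a description of any fielded system; the rules, thresholds, and labels are invented, and its abrupt failure outside the anticipated cases is exactly the narrowness just discussed.

```python
# Toy illustration of "reducing the knowledge of these people to rules": a few
# hand-written if-then rules classify a radar contact. The rules, thresholds,
# and labels are invented for this sketch; the point is how narrow such
# encoded expertise is, and how it silently falls back to "unknown" outside
# the situations its authors anticipated.

RULES = [
    (lambda c: c["speed_kts"] > 400,                 "aircraft"),
    (lambda c: c["speed_kts"] > 25 and c["at_sea"],  "surface ship"),
    (lambda c: c["speed_kts"] <= 25 and c["at_sea"], "fishing boat or slow ship"),
]

def classify(contact):
    for condition, label in RULES:
        if condition(contact):
            return label
    return "unknown"  # everything the rule authors did not anticipate

print(classify({"speed_kts": 480, "at_sea": False}))  # aircraft
print(classify({"speed_kts": 12, "at_sea": True}))    # fishing boat or slow ship
print(classify({"speed_kts": 10, "at_sea": False}))   # unknown
```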

In summary, a substantial effort to automate various aspects of battle at both the tactical and strategic levels is under way. As with smart weapons, we can ask to what extent a human decision maker is responsible for the consequences when the use of force is advised by such a system. Again, responsibility for decisions lies not only with the users of a battle management expert system, but also with the system's designers.

Automation of functions does not mean that things become any easier. Rather, the danger of being deluged by a "data monster" (tremendous amounts of collected information) is always present, and "understanding what [the monster] is trying to say is today's major communications problem" [63]. Furthermore, a tenet of battle is that information is never reliable or complete and that decisions are made under uncertainty. Expert systems in particular can give an illusion of control which may be unwarranted. To what extent can human decision makers be said to be in command of systems whose functioning becomes harder and harder to oversee and understand?

Leveson [31] discusses human-computer interactions that are necessary to control complex systems such as airplanes, air traffic, and industrial plants. In particular, she notes a trend toward humans supervising and monitoring computerized control systems, taking control only if the computer fails. One question that arises is whether humans can be expected to take control in an emergency. To do so demands that operators be alert at all times so as not to be caught off guard [41, 46]. This need for more active participation and awareness on the part of human controllers, coupled with the fact that accidents are often caused by unanticipated events, led Leveson to conclude that "it is doubtful that computers will be able to cope with emergencies as well as humans can. The emphasis should be on providing the human operator with an operational environment and appropriate information that will allow intervention in a timely and correct manner." As an important aside, Borning [12] notes that human intervention has been essential in resolving false alarms of impending missile attacks on the United States. This is particularly relevant in the context of a strategic defense system that must respond automatically.

Focusing simply on technology misses organizations and procedures that embed technology within a much larger operational setting. The Cuban missile crisis in October 1962 underlined the difficulties of controlling and directing a large, indeed global, system, especially when counteracting established procedures.



A fascinating analysis of factors (in particular, organizational) that influenced the outcome of this crisis is given by Allison [6], who argues that the Navy implemented its established procedure for a blockade 500 miles from Cuba even though President Kennedy had ordered a naval blockade closer to Cuba to give the Soviets more time to consider their actions.

In closing, the reservations of Robert R. Everett, president of MITRE Corporation, a research organization involved in military command and control,13 should be taken very seriously:

[I am] troubled by the goal of replacing the most expert people, especially by trying not only to replace them but to improve on their performance. Human activities tend to be very complex. . . . The suggestion that AI systems may somehow solve problems that we do not ourselves understand may come true in the far future but at the moment is both unreasonable and dangerous. People are useful; so are machines. Let us understand and provide for their separate roles.

Ironically, AI techniques, and expert systems in particular, are being used in an attempt to improve our understanding of political and military decision making, as we will explore in the next section.

WAR GAMING
A war game is an attempt at a realistic dry run of war that allows players to explore and evaluate different strategies, plan operations, or simply become more familiar with a particular war plan. War games have a long history: Go or Wei-Hai originated in 3000 BC, and it is believed that the Chinese invented chess almost four thousand years ago. More realistic war games using relief maps, "tin soldiers" to represent infantry and cavalry units, an arbiter to decide battle outcomes, and so on, were introduced by von Reisswitz, a Prussian lieutenant, in the early 1800s [65]. Although simulations using human players can be quite realistic, they too have been considered prime targets for automation since they are time consuming and irreproducible.

A more quantitative approach to the planning of military operations was introduced during the Second World War. Military operations were viewed and treated as crude scientific experiments: operations were carefully planned, results measured and then fed back for plan adjustment.14 Game theory, the mathematical analysis of the behavior of rational players, arose at about the same time. A notorious game-theoretic concept is the "prisoner's dilemma," in which two prisoners, unable to communicate with each other, are accused of a crime. If both prisoners confess, both will be punished mildly; if neither confesses, both go free.

13 In a March 1984 letter to Dr. Joshua Lederberg, chair of the Defense Science Task Force on Military Applications of New-Generation Computing Technologies [18].

14 Zuckerman gives a fascinating account of the first application of the scientific method to operations planning in From Apes to Warlords [67]. Before becoming Churchill's science advisor during World War II, Zuckerman was a biologist. In 1964, he became Chief Scientific Advisor to the British Government.

However, if one remains silent while the other confesses, the silent one receives a severe punishment while the confessor is rewarded. What would a rational prisoner do? Assuming that each prisoner attempts to maximize his expected outcome, each will confess and both will be convicted. The prisoner's dilemma has been viewed as a model of the superpower stand-off; unable to trust each other and hence to cooperate, both end up worse off. Of course, these games are simplistic and fail to capture salient aspects of the real world, for example, that the superpowers do communicate to some extent.
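The dilemma can be written down as a small payoff table, and checking each prisoner's best response shows why mutual confession results even though mutual silence is better for both. The payoff numbers below follow the conventional textbook ordering and are not taken from the article.

```python
# The prisoner's dilemma as a small payoff table. The numeric payoffs follow
# the conventional textbook ordering (larger is better for the row player) and
# are not taken from the article. Checking best responses shows why two
# "rational" prisoners both confess even though mutual silence is better.

# payoffs[(my_move, other_move)] = (my_payoff, other_payoff)
payoffs = {
    ("silent",  "silent"):  (3, 3),  # neither confesses: both go free
    ("silent",  "confess"): (0, 5),  # silent one punished severely, confessor rewarded
    ("confess", "silent"):  (5, 0),
    ("confess", "confess"): (1, 1),  # both confess: both punished mildly
}

def best_response(other_move):
    return max(("silent", "confess"),
               key=lambda my_move: payoffs[(my_move, other_move)][0])

for other in ("silent", "confess"):
    print(f"if the other prisoner is {other}, the best response is to {best_response(other)}")
# Confessing is the best response to either choice, so both confess and get
# (1, 1) rather than the mutually preferable (3, 3).
```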

Since the Second World War, increasing computer use has allowed some convergence between these approaches to analyzing the behavior of opposing nations. Computer programs are used not only to replace human players, but also to process players' actions faster and in a predictable manner. This speeds up a game and makes it more reproducible, although not necessarily more analytic or intelligible. Computer programs, especially large ones, are ideal hiding places for assumptions about the modeled situation. As Brewer and Shubik [13] note, players and more importantly policy makers tend to take results at face value without realizing that whatever comes out of a computer analysis is based upon underlying assumptions. Of course this problem is not limited to programs modeling strategies and decision making, but applies to any large program. However, simulating strategic decision making differs from, say, simulating airflow around the wing of an airplane. The latter predictions can be verified empirically, whereas the former cannot. Again, to talk about an expert system in the context of war gaming is misleading since there are no experts [17].
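As a hypothetical illustration of how such assumptions hide, consider the toy attrition loop below. It is not drawn from any real war game; the point is only that the printed "result" turns on effectiveness coefficients the programmer simply assumed, and that nothing in the output exposes them.

```python
# Hypothetical sketch of how assumptions hide inside a war-game program. The
# effectiveness coefficients below encode a judgment about how good each
# side's weapons are; they are invented numbers, not data, and nothing in the
# printed output reveals that they were assumed at all.

def simulate(blue, red, blue_effect, red_effect=0.008, days=30):
    """Crude mutual-attrition loop: each day a side's losses are proportional
    to the opposing force and that force's assumed effectiveness."""
    for _ in range(days):
        blue_losses = red_effect * red
        red_losses = blue_effect * blue
        blue = max(blue - blue_losses, 0.0)
        red = max(red - red_losses, 0.0)
    return blue, red

for blue_effect in (0.010, 0.014):  # two equally arbitrary assumptions
    blue, red = simulate(blue=1000.0, red=1200.0, blue_effect=blue_effect)
    print(f"blue_effect={blue_effect}: blue={blue:.0f}, red={red:.0f} remain after 30 days")
# The apparent outcome shifts markedly with the assumed coefficient, yet a
# reader of the output alone has no way to know the coefficient was a guess.
```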

Commenting on his experience using AI techniques for war games at the RAND Corporation, Davis [17] exhibits a healthy skepticism of expert systems:

The transparency of individual rules in English-like computer code was misleading; it was very difficult to review and comprehend entire rule modules, in part because it was difficult to judge completeness. . . . There was an insidious side to the practice of heuristic modeling: in taking the pendulum swing away from rigorous quantitative modeling, there was a tendency to write ad hoc rules without appropriate structure or theory.

Similarly, Anderson [7] observes that "there is nothing mystical or magical about AI, and the measure of our progress in exploiting the technology will be the speed with which the term AI is replaced by the more pedestrian computer model or computer simulation, for in the end, AI is just a fancy name for a collection of sophisticated computer programming techniques."




Unfortunately, skepticism is not the rule. Consider, for example, Schrodt's [48] conclusion: "Symbolic and algorithmic tools such as pattern matching operating on massive historical event data bases may well prove capable of finding regularities where numerical and algebraic techniques operating on small sets of variables have failed. If a sturdy locked door fails to yield to the blows of a sledgehammer, the solution is not necessarily a larger sledgehammer; picking the lock may be the more effective strategy." Somehow pattern matching on massive databases does not strike us as analogous to picking a lock.

As with the previous applications, computer simulations of warfare distance users from the consequences of their actions. Nowhere is this clearer than in strategic nuclear war simulations in which counterforce strategies, assumed to kill only 1,000,000 people, are compared with countervalue strategies, which are assumed to kill hundreds of millions of people and possibly bring civilization to a halt. Training our policy makers in the National Security Council or the Joint Chiefs of Staff or even in the White House on such simulations may have grave consequences if their choices are circumscribed to "either a million or a hundred million dead."

DISCUSSION
Recently, we received an advertisement for the C3I Report, a biweekly business report on military "command, control, communications, and intelligence." The advertisement exhorts business to "get advice on where to concentrate your internal R&D; learn which areas will get the biggest budgets for advanced research" and to "watch for artificial intelligence applications in many programs, including SDI." According to this advertisement, C3I is AI's future application. Although we do not see the military as the only future AI application, we see it as the "hidden" application, the one that everyone in the field lives with but doesn't talk about.

As responsible citizens not only of the United States but also of the world, we have a right and responsibility to investigate to what use our efforts are put. We must not only concern ourselves with acquiring and applying know-how, but we must also simultaneously inquire into the purpose and motivation behind the systems we develop [64]. In the military arena, for example, concerns about purposes and motivations often arise only after a technology has been developed and applied to a specific weapons program: ". . . it is not apparent whether the U.S. Navy's cruise missile doctrine is being formed by the serendipity of the technological push or by the pull of clearly stated operational requirements submitted by the forces afloat" [26]. Zuckerman goes even further when he asserts that "today, military chiefs . . . merely serve as the channel through which the men in the laboratories transmit their views. . . . It is he, the technician, not the commander in the field, who starts the process of formulating the so-called military need. . . . A new future with its anxieties was shaped by technologists, not because they were concerned with any visionary picture of how the world should evolve, but because they were doing what they saw to be their job" [68].15


We question the justifications and rationales given by some policy makers. For example, it is argued that it is necessary to continue military research and development because it is necessary to know what is physically, biologically, chemically, computationally possible so as to be able to balance any new developments in opposing nations. Clearly, knowing what your opponent can and cannot do increases your sense of security, and might result in more realistic policies. More deeply, our culture seems to value improvement per se. While this is undoubtedly appropriate in many cases, improving weapons means improving our capability to slaughter people. Thus the neutron bomb and third-generation nuclear weapons in general [54] are attempts at using the energy released by nuclear explosions more efficiently. In these cases, the qualification "improvement" is perverse, if not obscene. To return to the original argument for development, i.e., the need to know what is possible in principle, we would surely be more secure if certain weapons were not developed. Justifying the testing of nuclear-pumped X-ray lasers because we have to know whether the Soviets could develop them echoes the prisoner's dilemma. It would be more sensible to try to cooperate and stem the development of these devices and feel secure because neither side has this "improved" capability instead of feeling "secure" because both sides have the same capability.

We do not mean to downplay the complexities of these issues, but it seems that many decisions are made from a rather narrow perspective, and indeed derive their rationality from it. Stepping back and assuming a broader perspective leads one to question many decisions. Thus McNamara concludes that four decades of seemingly rational military and political decisions have led NATO to adopt an unacceptable first-use strategy that, if ever called upon, would at the very least destroy Europe [33].

If we expect other people, including politicians, to assume a broader and more long-term perspective, we should do the same. As computer professionals we should expand our concept of "job" to include obtaining and disseminating information about the broader ramifications of our work. In conclusion, we remind the readers of the ACM Code of Professional Conduct, Canon 2, Ethical Consideration One: "An ACM member is encouraged to extend public knowledge, understanding, and appreciation of information processing, and to oppose any false or deceptive statements relating to information processing of which he [sic] is aware."

Acknowledgments. We wish to thank the Institute on Global Conflict and Cooperation of the University of California, the Committee for Instructional Development at the University of California, Irvine, and the Department of Information and Computer Science at UC Irvine for their support.

15 Zuckerman [68] relates that British scientists worked on decoys for the British Polaris and warheads for MIRVed Trident missiles without prior governmental approval.


REFERENCES
1. Advanced Military Computing 1, 5 (1985), 1-2, 5-7.
2. Advanced Military Computing 1, 6 (1985), 4.
3. Advanced Military Computing 2, 9 (1986), 5.
4. Advanced Military Computing 2, 10 (1986), 2.
5. Advanced Military Computing 2, 17 (1986), 3-4.
6. Allison, G.T. Essence of Decision. Little, Brown and Co., Boston, 1971.
7. Anderson, P.A. Using artificial intelligence to understand decision making in foreign affairs: The problem of finding the right technology. In Artificial Intelligence and National Security, S.J. Cimbala, Ed., Lexington Books, Lexington, Mass., 1987.
8. Arkin, W.M. Tomahawk: Ominous new development. Bull. Atomic Scientists 40 (1984), 3-4.
9. Augarten, S. Bit by Bit: An Illustrated History of Computers. Ticknor & Fields, New York, 1984.
10. Badash, L., Hirschfelder, J.O., and Broida, H.P., Eds. Reminiscences of Los Alamos, 1943-1945. D. Reidel Publishing Co., Holland, 1980.
11. Barr, A., and Feigenbaum, E.A., Eds. The Handbook of Artificial Intelligence. Kaufmann, Los Altos: Vol. 1, 1981; Vol. 2, 1982; P.R. Cohen and E.A. Feigenbaum, Eds., Vol. 3, 1982.
12. Borning, A. Computer system reliability and nuclear war. Commun. ACM 30, 2 (Feb. 1987), 112-131.
13. Brewer, G.D., and Shubik, M. The War Game. Harvard University Press, Cambridge, 1979.
14. Charniak, E., and McDermott, D. Introduction to Artificial Intelligence. Addison-Wesley, Menlo Park, 1985.
15. Corcoran, E. Strategic computing: Far from the finish line. IEEE Institute, December 1986.
16. DARPA. Strategic computing: New generation computing technology: A strategic plan for its development and application to critical problems in defense. Oct. 1983.
17. Davis, P.K. Applying artificial intelligence techniques to strategic-level gaming and simulation. In Modelling and Simulation Methodology in the Artificial Intelligence Era, M.S. Elzas, T.I. Ören, and B.P. Ziegler, Eds., Elsevier Science Publishers/North Holland, Amsterdam, 1986.
18. Defense Science Board Task Force. Military applications of new-generation computing technologies. December 1984.
19. Dickey, A. Deep-sea robots cut their umbilical cords. The Engineer, 26+ (Sept. 11, 1986).
20. Dickson, P. The Electronic Battlefield. Indiana University Press, Bloomington, 1976.
21. Gerstenzang, J. Computers, lasers alter art of war. Los Angeles Times (Aug. 7, 1986).
22. Goodman, C.W., Jr. US military RPV programs have taken big strides in 1986. Armed Forces J. Int., 66+ (Dec. 1986).
23. Gurney, G. Rocket and Missile Technology. Franklin Watts, New York, 1964.
24. Hellman, P. The little airplane that could. Discover (Feb. 1987), 78-87.
25. Hum, M., and Miller, D. Cruise missiles: Future options. Proceedings of the U.S. Naval Institute 112, 8 (1986), 49-53.
26. Hum, M., and Miller, D. Cruise missile warfare. Proceedings of the U.S. Naval Institute 111, 10 (1985), 96-101.
27. IEEE Trans. Systems, Man, and Cybernetics (Nov./Dec. 1986).
28. Kanade, T., and Thorpe, C. CMU strategic computing vision project report: 1984 to 1985. Carnegie Mellon University, CMU-RI-86-2, 1986.
29. Klass, P.J. DARPA envisions new generation of machine intelligence technology. Aviation Week & Space Technology (22 April 1985).
30. Lemmons, P. Autonomous weapons and human responsibility. BYTE 10, 1 (Jan. 1985), 6.
31. Leveson, N.G. Software safety: What, why, and how. ACM Comput. Surv. 18 (1986), 125-163.
32. Littauer, R., and Uphoff, N., Eds. The Air War in Indochina. Beacon Press, Boston, 1972.
33. McNamara, R.S. Blundering into Disaster. Pantheon Books, New York, 1986.
34. Moore, M.K., and Schemmer, B.F. Pinpointing targets, not real estate, electronically. Armed Forces J. Int. 124 (Oct. 1986).
35. Navigation challenges autonomous vehicle. Aviation Week & Space Technology (22 April 1985).
36. Nilsson, N.J. Principles of Artificial Intelligence. Tioga Publishing Co., Palo Alto, 1980.
37. Office of Technology Assessment. Information technology R&D: Critical trends and issues. OTA-CIT-268, Feb. 1985.
38. Ornstein, S.M., Smith, B.C., and Suchman, L.A. Strategic computing: An assessment. Commun. ACM 28, 2 (Feb. 1985), 134-136.
39. Parnas, D.L. Software aspects of strategic defense systems. Commun. ACM 28, 12 (Dec. 1985), 1326-1335.
40. Pattern Recognition 18, 6 (1985).
41. Perrow, C. Normal Accidents. Basic Books, New York, 1984.
42. RADC focuses on expert system applications for C3, natural speech technology. Aviation Week & Space Technology (22 April 1985), 84.


43. Reagan, R. Address on ballistic missile defense. March 1983.
44. Redell, D., and Cohen, D. SDI: Is the software feasible? Videotape distributed by CPSR, Palo Alto, CA.
45. Rich, E. Artificial Intelligence. McGraw-Hill, New York, 1983.
46. Rochlin, G.I. High-reliability organizations and technical change: some ethical problems and dilemmas. IEEE Technology and Society Magazine 5, 3 (Sept. 1986), 3-9.
47. Rogers, M. Birth of the killer robots. Newsweek (25 June 1984).
48. Schrodt, P.A. Pattern matching, set prediction, and foreign policy analysis. In Artificial Intelligence and National Security, S.J. Cimbala, Ed., Lexington Books, Lexington, Mass., 1987.
49. Schutzer, D. Artificial Intelligence: An Applications-Oriented Approach. Van Nostrand Reinhold Co., New York, 1987.
50. Slagle, J.R., and Hamburger, H. An expert system for a resource allocation problem. Commun. ACM 28, 9 (Sept. 1985), 994-1004.
51. Stein, K.J. New strategic computing plan details programs, fiscal data. Aviation Week & Space Technology (15 December 1986).
52. Stevens, L. Artificial Intelligence: A Search for the Perfect Machine. Hayden Book Co., Hasbrouck Heights, N.J., 1985.
53. Sun, M. The Pentagon's ambitious computer plan. Science 222 (1983), 1213-1215.
54. Taylor, T.B. Third-generation nuclear weapons. Sci. Am. 256, 4 (1987), 30-39.
55. Thompson, C. Military direction of academic CS research. Commun. ACM 29, 7 (July 1986), 583-585.
56. Tsipis, K. Cruise missiles. Sci. Am. 236, 2 (1977), 20-29.
57. Tsipis, K. The operational characteristics of ballistic missiles. In World Armaments and Disarmament, SIPRI Yearbook 1984. Taylor & Francis, Philadelphia, 1984.
58. USAF lab simulates battle management tasks. Aviation Week & Space Technology (9 December 1985), 105.
59. U.S. General Accounting Office. Battlefield automation: Status of the Army Command and Control System Program. NSIAD-86-184FS, August 1986.
60. Walker, P.F. Precision-guided weapons. Sci. Am. 245, 2 (1981), 37-45.
61. Walker, P.F. Smart weapons in naval warfare. Sci. Am. 248, 5 (1983), 53-61.
62. Weiss, G. Battle for control of the Ho Chi Minh Trail. Armed Forces J. 108, 12 (15 February 1971), 18-22.
63. Weiss, G. Restraining the data monster: The next step in C3. Armed Forces J. 108, 21 (5 July 1971), 26-29.
64. Wiener, N. The Human Use of Human Beings: Cybernetics and Society. Avon Books, New York, 1967.
65. Wilson, A. The Bomb and the Computer. Barrie and Rockliff, Cresset Press, London, 1968.
66. Winston, P.H. Artificial Intelligence, 2d ed. Addison-Wesley, Menlo Park, 1984.
67. Zuckerman, S. From Apes to Warlords. Harper & Row Publishers, New York, 1978.
68. Zuckerman, S. Nuclear Illusion and Reality. Collins, London, 1982.
69. Zusne, L. Visual Perception of Form. Academic Press, New York, 1970.

CR Categories and Subject Descriptors: H.4.2 [Information Systems Applications]: Types of Systems-decision support; I.2.0 [Artificial Intelligence]: General; J.7 [Computer Applications]: Computers in Other Systems-military; K.4.1 [Computers and Society]: Public Policy Issues; K.7.m [The Computing Profession]: Miscellaneous-ethics

General Terms: Design, Performance
Additional Key Words and Phrases: Artificial intelligence research, battle management, smart weapons, social responsibility, war game simulations

ABOUT THE AUTHORS:

JACK BEUSMANS is completing his Ph.D. degree in computer science specializing in human visual perception.

KAREN WIECKERT is completing her Ph.D. degree in computer science specializing in social theory of computing.

Authors' Present Address: Department of Information and Computer Science, University of California, Irvine, CA 92717; [email protected]; [email protected].

Permission to copy without fee all or part of this material is granted provided that the copies are not made or distributed for direct commercial advantage, the ACM copyright notice and the title of the publication and its date appear, and notice is given that copying is by permission of the Association for Computing Machinery. To copy otherwise, or to republish, requires a fee and/or specific permission.