"those who forget the lessons of history are doomed to repeat it": or, why i study the...


JOHN A. N. LEE

The year 1996 marks the 50th anniversary of the public revelation of the ENIAC, the 50th anniversary of the establishment of the Large Scale Computing Subcommittee of the AIEE under the chairmanship of Charles Concordia, and the beginning of the 50th anniversary celebrations of the IEEE Computer Society and the ACM. Within those years, the computer field has not only developed but has had added to it new concepts and ideas that have transmogrified it into an almost unrecognizable entity. We are reaching the stage of development where each new generation of participants is unaware both of their overall technological ancestry and the history of the development of their specialty, and has no past to build upon. Herein we look at the study of the history of computing and its applicability to today's technological challenges, and conclude with the recommendation that we need to know enough about our history to protect ourselves from it and not be condemned to repeat it, but also to use it to our advantage.

Introduction

The impact of the information revolution on our society and our industry is immense. In our increasing desire to control our own destinies, we seek to understand not only our contemporary technology, but also to look to the past to recognize trends that will allow us to predict some elements of the future. Looking backward to discover parallels and analogies to modern technology can provide the basis for developing the standards by which we judge the viability and potential for a current or proposed activity. But we also have a feeling of responsibility for preserving the achievements of our forebears through the establishment of archives and museums, with the expectation that the pleasure of discovery will easily outweigh the profitability of mere historical rumination.

Although there had been pride in the achievements of the former years of technological development, it was in the mid-1970s when the American Federation of Information Processing Societies (AFIPS), with the foresight provided by Walter Carlson and the leadership of Henry Tropp, sponsored a project to preserve the history of computing. In cooperation with the National Museum of American History, the formal study of our history was initiated. This study is now at the point where it is a respected element of the field of the history of technology; our early researchers and scholars have been joined by several eminent historians, and the topic is beginning to produce young scholars who bring further prestige to the occupation. For the first time, the 1991 curriculum for computer science, developed by the IEEE Computer Society and the Association for Computing Machinery Joint Task Force, included explicit educational modules related to history in four specific areas:

1. With apologies to George Santayana.

* Artificial Intelligence,
* Operating Systems,
* Programming Languages, and
* Social, Ethical and Professional Issues.

History was NOT included in relation to algorithms and data structures, (computer) architecture, database and information retrieval, human-computer communication, numerical and symbolic computation, and software methodology and engineering. Some of these omissions are quite surprising; the history of computer architecture, for example, has been well researched and could provide a series of case studies which would strongly enhance this module.

However, the curriculum is not accompanied by guidelines and support materials related to the history of these subjects, and few teachers have the background, training, or resources to effectively provide this instruction. Common knowledge of the field is primarily populated with anecdotes and myths. The system of rewards and recognition in our academic institutions does not encourage teachers to develop their own materials in history, and, regrettably, there are few comprehensive textbooks to lead the way. Moreover, there is a stigma that often accompanies the decision of scholars to study history after a career in science: a suspicion that they are becoming senile. This is the same stigma that accompanies the move to the study of the educational needs of computer science, or to studies of the social impact of computing, and concerns for the ethical issues in the field. Yet the study of history is as intense as the study of the sciences, and many of the techniques of searching and researching, deduction and derivation are applicable to both activities. The difference, if any, is that the historian needs a bigger desk for the papers and more file space to store them. The techniques of historical research differ little from the techniques used in quality management reviews. In fact, the number of facts to be sifted through in order to reveal an event and to analyze the consequences is often far greater than in a "pure" science! Among the common problems are the resolution of conflicting memories, selective memory, and rewritten history. While the history of computing has not been the subject of the popularized culture of entertainment to the extent that (say) the history of the American West has been "improved," the coloring of history by remembering only the best of times and adding fanciful scenarios on top of the facts is quickly overcome by careful review. Unfortunately, anecdotal history is the quickest to be proffered, the easiest to record, and the most pleasant to read. Some criticism of oral histories [3] has pointed out that such activities are not simple to undertake if true histories are to be recorded.

Preservation is not history itself, but merely a prelude to the necessary stages of analysis and interpretation. Having decided to conduct a study of the history of computing, to preserve the remains of early projects and the ongoing history, and to analyze the events and activities of the pioneering efforts in computing, the question soon arises: why? Doing history can be fun and exhilarating, but the kind of history that is easiest is simply the recording and development of the chronology of inventions, events, and decisions. To be useful, history must be more than that. Toynbee's cliché, "Those who forget the lessons of history are doomed to repeat it," is an ambiguous concept. On the one hand, there are some lessons that need to be resurrected, repeated, and extended. There are others that, having been forgotten, ignored, or put aside, are now in danger of being repeated. We need to know enough about our history not only to protect ourselves against it, but also to learn from it.

The Myths About Computer History

To those of us who got involved in researching our history in the 1970s, the initial impetus was both curiosity and the expectation that what we were to do would be pleasurable. Our naiveté led us to believe that our data collection and preservation activities constituted both the beginning and the end of the process of historical scholarship. At the time it seemed that our major task was to transfer the memories of living archives to (supposedly) more permanent media. It was a pleasure to be able to talk to the pioneers and learn of their accomplishments, to have the means to preserve their stories and recognize their achievements. To many of our colleagues, history is only the study of an irrelevant past, with no redeeming modern value, a subject without useful scholarship. In fact some pioneers themselves choose not to participate in preserving their stories, claiming that they were more interested in the future than the past, and that science progresses to overcome its past. To other historians the study of activities which happened as recently as fifteen years prior is not real history! For example, on the one occasion when I approached the Virginia Antiquities Commission to recruit their assistance in (literally) unearthing the remains of Howard Aiken's Mark III computer, I was rejected with the comment, "We don't deal with anything more recent than the American Civil War!" Even a historian of technology commented that, "Computer history is too recent to forget." But such a statement does not take into account that the majority of the participants entered the field in the past decade, and that until the late 1950s there were no journals and until the mid-1960s there were few textbooks which recorded the state of the art of the time.

Slowly we have realized that there is more to "doing" history than simply the preservation of artifacts and anecdotes, and that we have the opportunity to "do" history in such a manner as to be useful today.

What Is History?

The very first conference on the history of computing, held at the Los Alamos National Laboratory in 1977 [13], was keynoted by the master of computer-related adages, Richard Hamming, who entitled his talk "We Would Know What They Thought When They Did It," in which he pleaded for a history of computing that pursued the contextual development of ideas, rather than merely listing names, dates, places, and "firsts." Hamming suggested that historians go beyond the published documentation and speculate about those elements of the history of projects and events that were still unrecorded. He pointed out that what people actually did and what they thought they were doing did not always find its way into the literature or records, and yet this is perhaps the most useful part of history.

Contemporary history is the activity that allows us to look behind us and to recognize, in their lifetime, our pioneers and their contributions. At the same time the delay between the instant of the innovation and the date of review allows us to view history from a reasonable distance: to increase our objectivity, to improve our ability to compare against parallel developments, and to recognize the downstream impact on today's technology.

In those early days these simple preservation activities produced what we believe is best termed "chronological history," since the results tended to be elementary lists of events with an emphasis on "firsts" such as have been frequently published in several journals and magazines on the occasion of a special anniversary [1, 7, 15]. Starting with the first ACM SIGPLAN History of Programming Languages Conference [19] in 1978, two years later than the Los Alamos conference [13], a real effort was put forward to follow Hamming's admonition. In this endeavor, the pioneers of 13 programming languages were asked to write papers which looked beyond the simple chronology of development [19]. The result was much more than simply a listing of the events that led to the implementation of the language; it also included the rationale for the decisions and a better feel for the environmental factors which influenced those decisions. In the main, the papers were still accounts of successes of innovation. The Second History of Programming Languages Conference in 1993 continued and extended this beginning [10] to produce a second set of papers on the programming languages of the 1970s and early 1980s. These question banks are not intended to create papers by merely collecting the answers; rather, they serve as prompts to incite memories about events that otherwise would be suppressed or ignored. Yet these are the very historical vignettes that serve to educate readers and inspire new concepts.

John Backus, in his acceptance speech on receiving the 1994 Charles Stark Draper Award, said:

In science and all creative work we fail, again and again. For each successful idea, we usually have dozens of others that don't work, no matter how hard we struggle. But in failing we learn a lot.

The teaching of science, as in most historical reporting, is often the teaching of successes; we shy away from learning about the failures because we believe that they are either uninteresting or time consuming. The baseline for understanding our science inflates each year, and as we grow older we forget that the basis for the support of a currently perceived baseline is not always obvious. We, as teachers, know about a subject because we stubbed our toes on that rock years ago, but we are impatient with those who do not have the knowledge of that injury as a learning experience.

The study of contemporary history should have the advantage of being able to refer to the primary sources for information regarding events of importance. These living archives tend to have faulty or selective memories, and, particularly in the area of software design, were not fastidious in documenting their achievements. One of the problems is related to the cliché, "Victory has a hundred fathers but defeat is an orphan," combined with our own predilection not to embarrass our forebears by highlighting their failures. Academically, there is a bias against publishing the results of research activities which show that (say) a particular approach to the solution of a problem cannot be achieved unless that observation is also accompanied by a formal proof. But an experiment which is unsuccessful is not reported, thus leaving open the way for others to fall into the same pothole.

Failures may teach us as much as successes, but it takes a particular style of learning to appreciate and use that information appropriately. It is tantamount to the lessons learned and the theorems proved by the method of "contradiction."

The preservation of artifacts is a questionable activity when it is intended that the objects will be simply deposited in a museum. Museums provide the field with two benefits: opportunities to interpret and educate, and occasions for public relations. Archives, on the other hand, provide a source of information for scholars, but it is not always necessary to handle an original artifact if it is well documented. The danger of preserving documentation rather than an object is that perhaps the documents assume too much, omit the obvious, and depend on a "hands-on" approach for understanding. Recently there has been a proposal to create a software archive in the United States. Willis Ware, a pioneer in his own right (being responsible for building the Johnniac computer), ruminated on the Internet:

Software is unquestionably among the most important contributions of recent generations to the history of mankind.

Yet little thought or effort has been devoted to how it should be preserved ... if there is value in doing so.

I say that because it is not at all apparent just why one would want to archive software. Most of it would have little value in the sense that books in a library do. You can't look at it, you can't get access to its intellectual content easily. Some of it would run only on certain equipment, but other than exercising the item, what would be the value in running some old item of software? Is there something to be learned or researched from the software, per se, versus its documentation, manuals and books on it? The answer is not at all obvious, I think.

The question comes down to whether we can judge in advance whether there will be "value" in an artifact in the future. Can we afford to throw away an object that currently is (relatively) worthless, when we cannot foresee its future usefulness? How many of us regret having thrown away that 10-cent child's comic book in the 1960s which is now worth $500, or the Barbie doll we got for our birthday?

The history of computing is little different from the other areas of study which we contemplate. There are few magic formulae to answer all questions, and while it is no panacea to solve the problems of the world, history may well provide keys to future developments. In his recent autobiography [17] George Stibitz explained his steps in planning for the development of a full-scale system following the construction of his "Model K" adder:

I thought it wise to use experience as a guide and to base my plans on the history of computation (emphasis added by this author).

Similarly, Howard Aiken repeatedly recognized the contributions of Charles Babbage in his papers, dedicating the first Harvard Computer Center manual to Babbage. Clearly our own pioneers knew of the power of history.

What Do We Need History To Be? The History of Non-Successes: The Non-History?

Returning to Richard Hamming's counsel, we must ask why it is important to understand the background of innovation and development. In fact, little of our literature or open records contains information on the motivation, the innovation, and the decision making that led to commercial products, to new methodologies, and to significant research results. Personal interactions drive people; people make decisions; people make up organizations; but in the literature it often appears that organizations somehow make decisions. Read almost any copy of the current research literature, and one must conclude that ideas appear unsummoned, only one train of thought led to enhancements, and the whole development of the topic proceeded without any side trips or dead ends. We know from our own experiences that this is patently not the case for anything more than the most trivial experiment. Thus the study of history, which will most benefit our future actions, must ask questions which reveal the environment in which an innovation came about, the personal attitudes of the managers who instigated the project, the background of the participants, their prior experiences and their achievements, and the influences of the known state of the art.

2. Galeazzo Ciano (1903-1944), an Italian Fascist leader, according to the electronic edition of The Concise Columbia Dictionary of Quotations.

3. Posted on the SHOT History of Computing LISTSERV, Dec. 1, 1993.

Business and sociological studies of computer enterprises can reveal practices and procedures that had an effect on the cultivation and extension of the industry. John Hendry investigated the negative impact of government policies on the fledgling computer industry in the United Kingdom following World War II, when the state of exploration in the field was on a par with that in the United States. He entitled his book [9] Innovating for Failure. This study contains lessons that have applicability to the current vogue in state government to channel and encourage university research in the names of technology transfer and economic development. Similarly, the history of the venture of Xerox into the computer field, taking over from SDC, and the failure to capitalize on the work of Alan Kay, Dan Ingalls, Robert Taylor, and others, is recorded in Fumbling the Future by Smith and Alexander [16].

Several CEOs have recently written their memoirs in which they attempt to justify their decisions in managing their respective corporations, but unfortunately their views are tainted by their perceived successes more than by their failures, from which we might learn much more. So it is with the achievements of the laborers in those same organizations, whose biographies are not likely to attain a "book-of-the-month" ranking. Their ideas and concepts which were previously rejected, perhaps because they were not capable of being implemented at the time, now may have a potential in the current-day environment and are therefore worthy of a revisit. John von Neumann's ideas for the parallelism of architecture were not capable of implementation in 1946; they are now being implemented. Other concepts for parallel algorithms had no machine on which they could be realized until recently, and thus were cast aside waiting for the day of their resurrection. History will tell us where these garbage dumps of ideas have been covered over, and the methods of history will tell us how to excavate them again! Archeology has its place even in computer history.

Firsts are not always important, though there is obviously a common desire to be associated with firsts; we might reword the statement that "Victory has a hundred fathers but defeat is an orphan" to read "Firsts have a hundred fathers but seconds are orphans." But, in fact, though many may lay claims to firsts, firsts are often a matter of definition rather than chronology, for almost everything we do can be considered a "first" for some minor distinction, such as the location of the event, the first use of some element, and so on. Several firsts can occur almost simultaneously, each emerging from a primordial soup of technology which contains all the ingredients for the development of a new innovation. Unique firsts do have a place in the identification of the owners of intellectual property rights with respect to claims on patents, copyrights, and such. Everyone likes firsts, but the attraction is for fame and fortune rather than downstream usefulness; firsts are better left to the Guinness Book of Records than being the subject of endless, meaningless arguments in scholarly journals.

Returning to our concern for what history can provide for the present and future, we must provide two major tools for our assistance: tools of analysis and interpretation. Unlike true science, history can be overloaded with facts, and one of the tasks of the historian is to separate the perceived from the synthesized facts. Events are not always logically interrelated, and consequences of actions are not always obvious. Thus to apply logical analysis does not always provide the correct chronology of development, but analysis of the events and the environment can reveal nuances which are helpful for future decision making. Interpretation must follow analysis, though some measure of interpretation can be provided by pioneers, but we must realize that this interpretation is biased by personal viewpoints and follow-on beliefs.

What Have We Learned Already?

Many studies of the history of computing have primarily been associated with the organization and resulting publications of conferences. The Los Alamos conference gets the credit as the first, but since that time (1976) history conferences have occurred approximately once a year throughout Europe and the U.S. The American Federation of Information Processing Societies organized a series of Pioneer Days in association with the annual National Computer Conferences. Pioneer Days were primarily associated with anniversaries and provided a period in which the topic could be intensely reviewed, revisited, and rejuvenated. In fact, Pioneer Days added little to the known history of the events being celebrated, but they did provide an opportunity to collect much of the information into one place (such as the Annals of the History of Computing) and to allow the pioneers themselves to add their comments on the happenings from a distant viewpoint.

Since 1980 there has been a growing library of books directly related to the history of computing, but the market has not been large compared to other texts, and it is rare that a book on the history of computing has seen a second printing or even a second edition. Publishers have been reluctant to expend the necessary support for the up-front cost of printing and binding the initial print run. While several books have related to specific topics and resulted from conferences on the same topic, there have been few texts which provided a comprehensive overview of the topic. Michael Williams's A History of Computing Technology [20] was extremely popular, but the lack of a second printing has stifled the offering of university and college courses on the history of computing. Stan Augarten's Bit by Bit [4] seems to have suffered the same fate.

The one publication which has stood the test of time has been the Annals of the History of Computing. Originally sponsored by AFIPS Press, the Annals has been printed and distributed by four different publishers over the past 15 years, and is hopefully now well established within the IEEE Computer Society. The 17 years of quarterly publications have provided the subscribers with insights into a wide range of subjects, from early calculating devices such as Napier's promptuary to the sociology of state-of-the-art supercomputers. Annals has suffered from a strong orientation toward the English-speaking world, though there has been greater attention to the history of computing elsewhere in the past seven years. The history of computing in France was included in five separate issues, and one special issue was devoted to the history of computing in Canada. There have been special issues related to machines (primarily IBM products) and academic institutions. Individual papers have examined certain aspects of national computing activities, such as a recent extensive paper on the former USSR. Also the Annals has become the primary journal for the publication of papers relating to Charles Babbage, his engines, and his other innovations.

Computer science⁴ is somewhat unique in that 30 years ago we could start from first principles and give the student everything there was to know about computing. We could show students the best route to a solution for a problem; there were very few alternative paths, and the wrong paths were quickly obvious. Is there a difference between "working from first principles" and taking a "historical approach?" To some extent there is a significant difference, but in today's teaching environment, when learners yearn for a multimedia approach, the association of ideas or inventions with real people and events is a useful teaching tool. Remember the Newtonian phrase:

If I have seen a little further than others it is because I have stood on the shoulders of giants.

Too often we fail to look down and see what those giants had to go through to achieve their gianthood. For anyone who has been in the field for more than a few years, the lessons learned are the lessons not of documentation, but of experience. Experience, that is, of having made mistakes, having made the wrong decisions, and occasionally of having achieved the hoped-for goal. James Horning said of the relationship between experience and mistakes:

Good Judgment Comes from Experience.
Experience Comes from Bad Judgment.

Disciplines are driven forward by the recognition of key concepts, or key products, which set the example for further development, give a sense of achievability, a set of shoulders to build upon. In Computer Science we can identify "milestones" which set those examples:

* ENIAC, UNIVAC, System/360 (hardware)
* Fortran, Algol, Basic (programming languages)
* IBSYS, CTSS, Multics, Unix (operating systems)
* General Problem Solver (GPS), ELIZA, DENDRAL (artificial intelligence)
* Alto, Altair, Macintosh (personal computers)

Just in the same way that we have observed that the "early" computers each derived from the common level of knowledge and achievement in electronics and physics of the 1930s, and that it is impossible to identify one single development which triggered the development of the computer, so it is with many of the major categories of advances in the field. From the need to direct the actions of the computer by some better means than rewiring the components came the concept of programming, and from the paucity of expression of "machine codes" came the need for programming languages. Thus many of the developments in computing explode, like an atomic reaction, from a state of the art which has sufficient composition to provide all (but perhaps one) of the ingredients necessary for the next step of development. Thus examining the history of computing requires not just the following of a single line of inquiry. The overall technical environment provides the ingredients while the human resource environment provides the impetus to take the next step.

4. It is, of course, an example of Whiggism to talk about computer science in 1960; the concept may have been there, but not the term!

Mahoney [12] retells a story, perhaps apocryphal, about Jean Piaget. The child psychologist was standing outside one evening with a group of 11-year-olds and called their attention to the newly risen moon, pointing out that it was appreciably higher in the sky than it had been at the same time the night before and wondering out loud why that was. The children were also puzzled, though in their case genuinely so. In his practiced way, Piaget led them to discover the relative motions of the earth, moon, and sun and thus to arrive at their own explanation. A month or two later, the same group was together under similar circumstances, and Piaget again posed his question. "That's easy to explain," said one boy, who proceeded to sketch out the motions that accounted for the phenomenon. "That's remarkable," said Piaget, "How did you know that?" "Oh," the boy replied, "we've always known that!"

Not only children, but people in general, and scientists in particular, quickly forget what it was like not to know what they now know. That is, once you have solved a problem, especially when the solution involves a new approach, it is difficult to think about the problem in the old way. What was once unknown has become obvious. What once tested the ingenuity of the skilled practitioner is now "an exercise left to the student."

Possessing a solution can mask the original problem in several ways. One may forget there was a problem at all, or undervalue the urgency it had, projecting its current insignificance back to a time when it was not trifling at all, but rather a serious concern. One may reconstruct a different solution, overlooking the restructuring of the subject which was brought about by the initial solution. One may ignore or undervalue alternative solutions that once looked attractive and could very well have taken development in a different direction. Historical review provides the opportunity, along with 20/20 hindsight, to examine the "what-if?" scenarios in the light of current technology and their new applicability. One of the advantages of this review is that there are no longer right or wrong answers to the original questions. Historically, a "right" answer requires just as much explanation as a "wrong" answer, and both answers are equally interesting, and equally important.

What Can We Do With Historical Information?

We have already emphasized the prospects for recovering lost opportunities through historical review, but one of the other benefits of retrospection is the occasion to identify the fundamentals of concepts which have now become perceptually complex.

An examination of many of the central concepts of science from birth to maturity shows a perceived-complexity-versus-time curve which is roughly parabolic. In the beginning, the initial idea is often expressed in such complex terms that it is overlooked by the majority of readers. The one or two who can see through the veil of complexity simplify the concept and express it in understandable terms that, in computer science, can lead to implementation and application. As the resulting system is used and explored, it is again modified and enhanced to achieve a new zenith of complexity, in that the level of applicability is high while the degree of understanding (seen as the complement of perceived complexity) has dropped once again to a low setting. Revealing a basic concept, understanding innovation, and watching increasing obscuration are not only part of the benefits of the review of an extended project, but also of the understanding of product development, some aspects of which are to be avoided.

[Figure: perceived complexity plotted against time, falling from the original concept to a minimum at implementation and rising again through enhancements and extensions.]

We have many examples of this phenomenon in computer science and we are sure that similar examples can be found in other sciences. The phenomenon of increasing perceived complexity in the latter stages of development creates a barrier against the overall study and presentation of an apparently complex concept. To better understand the concept we should return to the nadir of the complexity curve to work outwards to comprehend the source of the concept and its evolution.

A simple example was given to me by one of the early reviewers of this paper. His daughter, a computer science graduate following in her father's shoes, had joined a small company in the Northeastern United States alongside graduates of three other institutions. One of their first tasks was to develop what amounted to a lexical analyzer, but not having access to a Unix system containing a standardized analyzer, LEX, the other three were stumped. Having access to an education that included studies of compiler technology prior to the implementation of YACC (Yet Another Compiler Compiler) and LEX, my friend's daughter was able to solve her employer's problem. An extreme example perhaps, but like other sciences, computer science education requires studies of the fundamentals of our field on which to build and deduce additional concepts.

In 1981, during the study of the development of the first FORTRANSIT compiler in preparation for the celebration of the 25th anniversary of the release of the first Fortran compiler, we were reminded of a long-lost element of the technique of arithmetic expression translation. One method of analysis involves fully parenthesizing the infix expression by surrounding each operator and its associated, possibly compound, operands with a parenthesis pair. This form of expression does not require any knowledge of the hierarchical relationships between contiguous operator-operand groups. Any parenthetical group that contains no other parenthetical groups can be evaluated (or compiled). Consequently that group can be replaced by the computed value or, in the case of compilation, by the location of the result, thus revealing a new "computable" group. One other possible result obtainable from a fully parenthesized expression is a prefix form of the expression, which coincidentally is parenthesis free. The methodology consists simply of surrounding operators with a number of back-to-back parentheses in inverse proportion to the hierarchy of the operator, and assuming that there are an unbounded number of parentheses on each end of the expression. This simple textual replacement results in an expression which is fully parenthesized, except for sequences of operators of like strength, which would be evaluated (or compiled) in a strict left-to-right ordering. Today we rely on much more complex solutions to simple problems!
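As an illustration of the textual trick described above, here is a minimal sketch in Python. It is my own reconstruction, not the FORTRANSIT code, and it assumes single-character operands and only the four basic operators: lower-precedence operators are padded with more back-to-back parentheses, and enough parentheses are added at each end to balance the padding.

```python
# Sketch of the full-parenthesization trick: operators of lower precedence
# get more back-to-back parentheses; user parentheses outrank everything.
PAD = {
    "+": "))+((", "-": "))-((",   # low precedence: two parentheses each side
    "*": ")*(",   "/": ")/(",     # higher precedence: one parenthesis each side
    "(": "(((",   ")": ")))",     # user parentheses get three
}

def fully_parenthesize(expr: str) -> str:
    """Return the padded, fully parenthesized form of a simple infix expression."""
    body = "".join(PAD.get(ch, ch) for ch in expr if ch != " ")
    # Two wrapping parentheses at each end suffice to balance the + and - padding.
    return "((" + body + "))"

print(fully_parenthesize("A + B * C"))    # ((A))+((B)*(C))
print(fully_parenthesize("(A + B) * C"))  # (((((A))+((B))))*(C))
```

Repeatedly evaluating (or compiling) any group that contains no other group then yields the conventional precedence: in the first example B*C is reduced before the addition, and in the second A+B is reduced before the multiplication.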

History As A Set of Case Studies

John von Neumann has been blamed by John Backus, creator of Fortran, for the "bottleneck" in computing and the restrictions of the so-called von Neumann machine. Backus' Turing lecture [5] in 1978, in which he introduced his concepts of functional programming (FP), was entitled "Can Programming be Liberated From the von Neumann Style?" In fact, Backus' understanding of the contributions of von Neumann was in error. John von Neumann recognized the advantages of parallel computation, but he also recognized that in his time the technology of architectural design was not sufficient to support truly parallel systems, and that programming had not developed the techniques of multiprogramming. Thus to solve the problems of the day he accepted serial or sequential computation, the very technology for which he is now blamed. In fact von Neumann traveled the complexity curve from its initial high point to a practical, implementable minimum. To complicate our understanding of the development of the computer, John von Neumann allowed a myth to build around the origination of the concept of the stored program. The so-called EDVAC report of the work of several scientists was never completed, and the first draft [18], which only ascribed authorship to von Neumann, was taken to be the original source of the idea. Over the years, von Neumann never took the time to dissuade his creditors of the actual genesis of the stored-program concept. This was discussed by several computer pioneers at the 1982 National Computer Conference [2] in Houston, Texas, who added the understanding that, like other "firsts," this one too had several fathers. In fact, Konrad Zuse has recently suggested [15] that the Z-1 computer contained a stored program in 1937. Similarly, Grace Murray Hopper, who died just a few years ago, was credited by some writers of her many obituaries with the "development of the programming language Cobol." History clearly shows that she would better be credited as the grandmother (as the originator of FLOWMATIC), as the midwife (as a member of the organizing committee that recommended the evolution of a business-oriented programming language), or as the mentor (in her capacity as one who promulgated the language and possibly instigated the writing of the first two compilers). Over the years, Dr. Hopper never took any steps to dissuade believers of this misplaced credit, but neither did she personally claim the credit for the language. John Mauchly claimed credit for the invention of the computer, and for its construction in conjunction with J. Presper Eckert. Yet in 1973, the patents were invalidated by the Minneapolis District Court and "one, John Vincent Atanasoff" was, by law, given the credit for the invention of the computer, at least in the U.S. Clark Mullenhoff [14] went so far as to accuse Mauchly of perjury on the witness stand. Mauchly's widow, Kay Mauchly, has, subsequent to the trial, discovered pieces of early electronic devices that her husband built prior to his visit to Atanasoff's home in 1941. But we are now aware that there was similar work going on in England by Alan Turing and in Germany by Konrad Zuse. After the unveiling of the ENIAC, the University of Pennsylvania asked J. Presper Eckert and John Mauchly to sign over their intellectual property rights to the university. The resulting disagreement led to the departure of Eckert and Mauchly from the university and the subsequent creation of the first electronic computer company.

One of the concerns of today's education is the integration of ethical concerns into our curriculum. From our three examples above, and many others that we can find in our history, we have a set of case studies from our own profession which have a direct bearing on ethical studies in computer science.

Martin Campbell-Kelly [6] recently presented a paper in which he examined the impact of three different revolutions on the British census from 1801 through 1911. These revolutions he classified as the "Victorian Revolutions": Victorian Data Processing, Office Mechanization, and Office Automation. He showed that throughout the three stages of applying technology to the problems of counting the census, even in the face of increasing complexity of questionnaires and even greater increases in population, the actual cost per capita changed very little, while the number of employees necessary to complete the processing increased in proportion to the population, independent of the insertion of technology. The early censuses were counted by hand; the first revolution included the use of calculators to summarize the results; the second revolution involved the centralization of the counting systems and the application of the methods of the division of labor according to capabilities (later called the assembly-line approach); and the last revolution in the study period resulted from the use of card processing and tabulating equipment. It is interesting that although many of the census bureaus of the world converted to card processing in the 1890s, Great Britain waited for almost 20 years, took advantage of the earlier experiences, and did not itself experience the gigantic step increase in cost that had occurred in the United States. What may we learn from such a study? We might conclude that in considering the application of technology to an industry, there are three stages of impact:

* primarily, technology provides a "better" way of doing the same task: the possible direct replacement of human operators with no increase in productivity;
* secondly, technology provides a faster means of doing the same task, increasing the throughput of the overall system and creating a time compression, while retaining the current staffing (perhaps with the need for some retraining);
* and finally, technology provides the capability not only of solving current problems but also of providing capabilities that were previously beyond those of the prior system, without compromising quality.

Does the study of the censuses of the 1800s have an application to the census today? Is there a comparison with the application of the computer to the census in 1951, when the first UNIVAC was delivered to the U.S. Census Bureau? Is there a comparison with the application of technology to other industries either in the 1800s or today? Perhaps we need more data, but we can see similarities with the application of the computer to modern society. Are these three characteristics of technology insertion also applicable to the insertion of the computer into modern systems? Can we see what is needed in the modern system to ensure that there is greater productivity with no loss of jobs, in a market environment which can accommodate improved quality and quantity of products? Two examples in the communications arena come immediately to mind: the automation of the telephone system (and the resultant saving of the entire female work force from the penury of serving as operators) and the dispatching of taxi cabs. The application of automation to both processes can be seen to have traversed these three stages. From this model we can predict the impact of automation on new applications, and consider the ethical implications of this work.

Ethical considerations are a primary consideration in the report by Leveson and Turner [11] on the accidents related to the Therac-25 radiation therapy system. This investigation, while dealing only with activities within the past 15 years, is typical of the kind of historical research which has positive downstream impact. Leveson and Turner's analysis of the software errors provides a number of lessons in software development which reinforce the many lessons which have been preached in software engineering classes for a decade, unfortunately the decade after the installation of the Therac-25 equipment. Undoubtedly the lessons learned from such errors (and the resulting human suffering) are much stronger than those of positivism.

We can develop a model of development and then apply that model to new concepts. For example, the development of computer science proceeded in periods:

* the emergence of the concept (1940s)
* the emergence of a set of related but distinct disciplines around a central theme (1950s)
* a multidisciplinary approach (1960s)
* interdisciplinary integration (1970s)
* specialization and possible fragmentation (1980s)

We can see that same sequence in other elements of computer science (part of the fragmentation) such as human-computer interaction:

* the emergence of the concept (mid-1970s)
* the emergence of a set of related but distinct disciplines around a central theme (late 1970s)
* a multidisciplinary approach (1980s)
* interdisciplinary integration (1990s)
* specialization and possible fragmentation (the future?)

We can also see the emergence of a science in the progression from basic research, presentations in graduate seminars, applications, occasional courses, graduate programs, and finally undergraduate courses and specialties (and standardization somewhere in the sequence).

Consider the insertion of computer technology into:

* the insurance industry
* banking
* hotels
* airlines
* (other service industries)


Compare the insertion of other technologies:

* speech, writing, printing
* telephone
* steam engine

History as a Teaching Tool

Is history a way of teaching computer science? Is history a part of teaching computer science? Is there an advantage to going that far back? In fact all teaching is the teaching of history, or at least the retelling of events and discoveries of both the recent and the far past, but not always in the context of history. Learning is the understanding of discoveries and the application of self-discovery techniques to personally unexplored fields. Research starts with the study of the history of a topic and attempts to build new lines of discovery or development. Thus the questions above should be better stated in terms of placing the teaching of computer science in the environment of historical scholarship. The answers to the questions above are most likely in the negative, but the preparation for teaching and the management of learning can take advantage of knowledge of the perceived complexity curve to select the appropriate point for creating a beginning point for understanding. Other artifacts from history can assist in the understanding of simple concepts by providing a model or machine through which to learn abstract ideas. These concepts may be the easiest to insert into the current environment and thus may appear to be the greatest immediate benefit of history. My favorite is Napier's chessboard calculator. While many have learned of Napier's book entitled Rabdologia, which contained the description of the multiplication "bones," few have learned of the two other machines which he described: the promptuary and the chessboard calculator. The promptuary, being an enhancement of the bones, is of lesser interest. However, the chessboard calculator, which supports a wider range of arithmetic operations, is simpler to construct. The shortcoming of the chessboard calculator in times past may have been its dependence on the binary representation of the operands and Napier's use of letters to represent number place values rather than numeric values; today those problems should be of lesser concern. While the chessboard calculator does not have any historical significance, past the fact that it was developed by John Napier in the early 1600s, it does have an instructional facility which is useful in the teaching of binary number systems. The calculator clearly displays the problems of carry and borrow in addition and subtraction, respectively, and provides a clever model for multiplication and division. History can provide a backdrop for the study of a series of abstract machines starting from Napier's machines, Oughtred's slide rule, Pascal's adder, Babbage's engines, Hollerith's tabulators, Turing's universal machine, to Stibitz' relay machines and Wilkes' original RISC machine, the EDSAC. Several educators have developed emulators for several early systems which can provide an excellent introduction to the study of computer systems and architecture.
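To make the carry rule concrete, here is a minimal sketch in Python. It is my own illustration of the mechanism the chessboard calculator makes visible, not Napier's own notation: a number is a set of counters on squares worth 1, 2, 4, 8, ..., and two counters on the same square are exchanged for one counter on the next square up.

```python
# Hypothetical model of the chessboard calculator's carry rule: a number is a
# bag of counters on squares worth 1, 2, 4, 8, ...; any two counters on the
# same square are traded for a single counter on the square of double value.
from collections import Counter

def to_counters(n: int) -> Counter:
    """Place one counter on each square corresponding to a set bit of n."""
    return Counter({2 ** i: 1 for i in range(n.bit_length()) if (n >> i) & 1})

def normalize(board: Counter) -> Counter:
    """Apply the carry rule until no square holds more than one counter."""
    changed = True
    while changed:
        changed = False
        for square, count in list(board.items()):
            if count >= 2:
                board[square] -= 2
                board[2 * square] += 1
                changed = True
    return +board  # discard empty squares

def add(a: int, b: int) -> int:
    """Binary addition as the board performs it: merge the counters, then carry."""
    board = normalize(to_counters(a) + to_counters(b))
    return sum(square * count for square, count in board.items())

print(add(13, 7))  # 20; carries ripple 1+1 -> 2, 2+2 -> 4, 4+4 -> 8, 8+8 -> 16
```

The point of the sketch is pedagogical, in the spirit of the paragraph above: every carry is an explicit, visible move of a counter, which is exactly what makes the device useful for teaching binary arithmetic.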

What can we learn from "old" machines?

* basics: early RISC systems
* problems before new tools were developed
* techniques of implementation, e.g., floating-point arithmetic
* appreciation of current technology
* an uncomplicated concept that can be analyzed

Including historical perspectives in commonly taught computer science courses can bring some realism to the topic and at the same time provide some relief from the stultifying effects of purely abstruse presentations. For science students there is an additional advantage to studying history: the opportunity to develop other skills such as writing and presentations. The study of history also provides opportunities for interdisciplinary activities matching science students with those in the humanities and the social sciences. Rather than using history merely as an old story to be told, the history of computing has elements which can be used effectively to supplement and support the pedagogical aspects of the field. Starting too far along the development curve deprives the student of the understanding that concepts do not emerge full blown. Starting early on the development curve, and at the nadir of the perceived complexity curve, strips away the numbing, hubristic impact of enhancements and extensions.


One danger remains: the judging of past events in the light of today's knowledge rather than against the background and technological environment of the day, an attitude we call "Whiggism," as opposed to asking how old technology fits into today's environment.

Conclusions

What makes history? It is a matter of going back to the sources, of reading them in relation to their local environment, and of thinking one's way back into the problems and solutions as they looked then. It involves the imaginative exercise of suspending one's knowledge of how things turned out to recreate the possibilities that were still open at that time. In The Go-Between [8], Leslie Hartley remarked that "The past is a foreign country; they do things differently there." The historian must learn to be an observant tourist, alert to the differences that lie behind what seems familiar, but always looking for the newly applicable.

Our current scientific understanding of the field of computing is incomplete; our ability to engineer reliable products is also incomplete, even though our pioneers started out by proving the correctness of their creations. Like the kayaker, we sit bobbing on the edge of an eddy, on the one hand ready to be swept downstream to continue the fight against the flow of nature, and on the other hand straining to achieve the chance to collect our thoughts in the still waters of understanding. This oscillation between pure and applied science swirls us around, the most effective progression downstream being accomplished by conserving our energy and analyzing the last descent before we tackle the next obstacle. The technolust in our profession today discards the immediate past in favor of the latest toy; we do not pause long enough either to enjoy the tranquillity or to understand what we have accomplished. A well-mounted hype that builds desire for glossy new artifacts has begun to win over reason and planning. As the amount of data left behind decreases, our ability to analyze the ongoing computing experiment diminishes and the problem grows more and more difficult. What really did work and what did not? Is speed taking the place of ingenuity? Is faster so much better? The announcement of a new system in one of the glossy personal computer magazines suggests that all prior models are now to be viewed with contempt and to be regarded as superfluous. Old systems become museum pieces or are handed down to the "less fortunate": to the biologists and the educators.

In the next 10 years historians of computing need to develop a methodology and historiography that is pertinent to the field, that will provide it with integrity and scholarship to enable it to justify its place within the larger field of the history of technology. The history of computing must begin to show usefulness and applicability to the modern world to begin to develop inroads into the field which it purports to investigate. Like software engineering in the 1970s, the history of computing needs to become formalized, canonized, and sanctified. Institutions such as the Charles Babbage Institute, the Computer Museum, the Deutsches Museum, the Science Museum, the Smithsonian Institution, and other organizations devoted to preserving and analyzing the history of computing must take the lead in this endeavor. At the same time we must encourage public and private granting organizations to recognize the benefits to be gained from contemporary historical research and to support the production of new researchers and teachers.

There is a place for historians in computer science, and the study of historical elements of the field by students, at all levels, is legitimate. The historian cries out to the fleeing back of a computer scientist, "pause to smell the flowers!"

Acknowledgments

My many thanks to my several colleagues who have reviewed and commented on these thoughts, though that is neither to claim that they agree with the content totally nor that I have followed their suggestions rigorously. Also sincere thanks to Michael Mahoney, Princeton University, for his permission to quote from his article "What Makes History?" which we have used as a primary reference for prospective authors to the Annals for several years.

References

[1] ACM, A Quarter-Century View. New York: ACM, 1971.
[2] William F. Aspray, "Pioneer Day, NCC '82: History of the Stored Program Concept," Annals of the History of Computing, vol. 4, no. 4, pp. 358ff, 1982.
[3] William Aspray, "From IEEE's Perspective," Annals of the History of Computing, vol. 15, no. 1, pp. 5-6, 1993.
[4] Stan Augarten, Bit by Bit: An Illustrated History of Computers. New York: Ticknor & Fields, 1984.
[5] John W. Backus, "Can Programming be Liberated From the von Neumann Style? A Functional Style and Its Algebra of Programs," Communications of the ACM, vol. 21, pp. 613-641, 1978.
[6] Martin Campbell-Kelly, "Information Revolutions and the British Census, 1801-1911," Troisième Colloque Histoire de l'Informatique, Sophia-Antipolis, France, Oct. 13-15, 1993.
[7] "The Computer Age," Computerworld, pp. 14-18, Nov. 3, 1986.
[8] Leslie Poles Hartley, The Go-Between. New York: Knopf, 1954.
[9] John Hendry, Innovating for Failure. Cambridge, Mass.: MIT Press, 1989.
[10] J.A.N. Lee, ed., "Guidelines for the Documentation of Segments of the History of Computing," Annals of the History of Computing, vol. 13, no. 1, pp. 51-62, 1991.
[11] Nancy G. Leveson and Clark S. Turner, "An Investigation of the Therac-25 Accidents," IEEE Computer, pp. 18-41, July 1993.
[12] Michael S. Mahoney, "What Makes History?," The Second History of Programming Languages Conference (HOPL-II), ACM SIGPLAN Notices, vol. 28, no. 3, pp. x-xii, Mar. 1993.
[13] N. Metropolis, J. Howlett, and Gian-Carlo Rota, A History of Computing in the Twentieth Century. New York: Academic Press, 1980.
[14] Clark R. Mullenhoff, Atanasoff: Forgotten Father of the Computer. Iowa State University Press, 1988.
[15] Linda Runyan, "40 Years on the Frontier," Datamation, pp. 34-57, Mar. 15, 1991.
[16] D.K. Smith and R.C. Alexander, Fumbling the Future: How Xerox Invented, Then Ignored, the First Personal Computer. New York: William Morrow & Co., 1988.
[17] George R. Stibitz, The Zeroth Generation. Private printing, 1993.
[18] John von Neumann, "First Draft of a Report on the EDVAC," 1945; reprinted in Annals of the History of Computing, vol. 15, no. 4, 1993.
[19] R.L. Wexelblat, ed., History of Programming Languages. New York: Academic Press, 1981.
[20] Michael R. Williams, A History of Computing Technology. Englewood Cliffs, N.J.: Prentice-Hall, 1985.

J.A.N. Lee is the former editor-in-chief of Annals, and is vice president for membership of the IEEE Computer Society. He has been a professor of computer science at Virginia Tech since 1974 and with the Center for the Study of Science in Society since 1990. He served as the vice-president of the ACM from 1984 to 1986. Dr. Lee can be reached at the Department of Computer Science, Virginia Tech, Blacksburg, VA 24061-0106; (540) 231-5780; e-mail: janlee@cs.vt.edu; Web: http://ei.cs.vt.edu/~janlee/.