Book Reviews

Video Dialtone Technology: Digital Video over ADSL, HFC, FTTC, and ATM. Daniel Minoli. New York, NY: McGraw-Hill; 1995: 495 pp. Price: $60.00. (ISBN 0-07-042724-0.)

The book is about the delivery of interactive video services from video information providers to end users. It explores the various technologies and technical issues in a fair amount of detail and, in addition, discusses the surrounding regulatory and marketing aspects. The 15 chapters include some that are of a tutorial nature, containing overviews of cable television technology, video compression, and asynchronous transfer mode (ATM). These chapters provide excellent background information to prepare the unfamiliar reader for the core chapters, which cover the video dial tone technology alternatives, their comparative assessment, and issues that need resolution.

The first chapter provides a broad overview of the entire field of video dial tone (VDT), summarizing a wide spectrum of topics ranging from technical, regulatory, and market aspects to service development; the chapters that follow examine these in greater detail and depth. The chapter describes the role that video plays as a key driver of the Information Superhighway and how the new video delivery systems differ from traditional video broadcast networks. It discusses the convergence of the traditional TV set and the home PC into a single device that provides much greater interaction with the user than the present-day TV. The chapter also contains overviews of traditional video distribution, VDT network views from the perspectives of the cable TV companies and telephone companies, technological options for the physical distribution of video signals, signaling and control, and VDT applications. ATM as an emerging switching technology, user-premise devices, and video compression are also briefly introduced; these are discussed in greater depth in later chapters.

The next two chapters are on VDT market and regulatory aspects, respectively. Chapter 2 presents some market statistics and then extends the discussion to the nature of the market, market opportunities, and the competition. The impact of societal changes on the VDT and on-line services markets is also examined. Chapter 3 examines various recent judicial and legislative acts, as well as FCC (Federal Communications Commission) policies and rulings applicable to video services. The regulatory aspects range from the various restrictions imposed on RBOCs (Regional Bell Operating Companies) and cable TV companies to their cross-ownership issues. The chapter includes recent attempts in the U.S. to deregulate and relax some of the restrictions to enhance competition.

Chapter 4 is of a tutorial nature. Brief introductions to the basics of television, cable TV, and fiber-optic technologies are given. Television basics cover the scanning process, how a composite video signal is formed, and the digitization and compression of video signals. Major components of a cable TV distribution network, channel multiplexing in coaxial cable systems, and the digital modulation techniques used in coaxial cable systems are explained under the basics of cable TV. Fiber-optic technology covers SONET, analog and digital modulation techniques in optical systems, the major components of a fiber access network, and network topologies.

© 1996 John Wiley & Sons, Inc.

The initial part of Chapter 5 describes the logical and functional views of the VDT architecture and the physical network infrastructure needed to support VDT services. The network infrastructure is described in terms of backbone and distribution subnetworks and switching technologies. An outline of the various alternatives possible in the distribution subnetwork (twisted pair, coaxial cable, fiber, and hybrid alternatives) is given. The remaining portion of the chapter describes the signaling and control functions in a VDT network. Different categories of signaling (between applications, between the end user and the network, and between network elements) are introduced, and a general functional model of signaling and control is presented. The chapter concludes with an overview of Bell Atlantic's signaling specification, as an example to illustrate the signaling concepts.

Chapters 6, 7, and 8 are on video compression, video servers, and ATM technology, respectively. Chapter 6 contains concise overviews of the JPEG, MPEG-1, and MPEG-2 compression standards and products. Chapter 7 explores the current state of the video server market, server technologies, and costs. A server interconnection architecture is presented, followed by a detailed description of server components and alternative memory arrangements. A brief overview of some currently available video servers is included. The chapter concludes with a survey of optical storage technologies and systems. The initial part of Chapter 8 contains an overview of ATM, including the protocol model, a description of the protocols, and the equipment market. This overview is followed by a detailed discussion of transporting video traffic over ATM. The discussion focuses on the use of ATM adaptation layers 1 and 5 (AAL 1 and AAL 5) for transporting video, and on the technical issues that need to be addressed (some of which are already being addressed by the ATM Forum).

The next four chapters explore in detail the alternative access network technologies for the delivery of video services to end users. Chapters 9, 10, and 11 cover ADSL (Asymmetric Digital Subscriber Line), HFC (Hybrid Fiber Coax), and FTTC (Fiber-to-the-Curb) and FTTH (Fiber-to-the-Home) distribution systems in detail, with examples of currently available and proposed systems. Variants of ADSL systems (ADSL-1, -2, and -3), including modulation schemes, frequency spectrum, and network configurations, are examined. The chapter on HFC (Chapter 10) includes brief descriptions of recently proposed HFC systems and a detailed description of the Time Warner network. The chapter on FTTC and FTTH explores alternative access network architectures, including example systems and typical costs. Chapter 12 contains an overview of other technology options. These are mainly wireless-based systems, which include direct broadcast satellite (DBS) and multichannel, multipoint distribution service (MMDS).

JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE, 47(6):477-483, 1996. CCC 0002-8231/96/060477-07

Each option is described with the corresponding network con- figurations.

Chapter 13 covers the user-premise equipment, mainly the setup box (or "set-top box"), with a brief description of the (evolving) TV set. The functions and capabilities of the set-top box are described in depth, with greater emphasis on its signaling capabilities. Security issues related to the set-top box are examined. The chapter concludes with a brief description of the 1994 version of the Macintosh TV, as an example of a device that combined the functionality of a home computer and a TV set.

The last two chapters are on service providers. Chapter 14 includes a discussion of the telephone companies (mainly the Regional Bell Operating Companies), with their proposed strategies, plans, and technology trials. Chapter 15 describes the plans and activities of the cable TV, telecommunications, and computer companies that would also play an active role in video service development and delivery.

Overall, the book is very well written and well worth reading: technical readers will gain a broad, sufficiently in-depth technical knowledge of this rapidly evolving technology area, and other readers will gain an appreciation of video and related technologies, the VDT market, and regulatory aspects. (The discussion of the regulatory environment is applicable only to the U.S.) Each chapter contains references for the benefit of readers who desire more detailed information on a particular aspect of a topic.

Bandula Abeysundara, Ph.D.
Engineering Specialist
VISTAR Telecommunications, Inc.
Suite 1410, 427 Laurier Ave. W.
Ottawa, Ontario, Canada K1G 354

Designing and Writing Online Documentation: Hypermedia for Self-Supporting Products. Second Edition. William Horton. New York, NY: John Wiley; 1994: 439 pp. Price: $29.95. (ISBN 0-471-30635-5.)

The information explosion has spawned a silent, but equally lethal, partner: the "documentation explosion." As the technological infrastructure that surrounds our lives becomes more complex, so does the documentation that supports this machinery. For example, mechanics servicing World War II fighter aircraft had a thousand-page manual at their disposal. Today, the mechanics working with the B2 stealth bomber must master about one million pages of documentation before they can service their aircraft. It has been estimated that Navy ships currently carry ten to forty tons of paper manuals and forms on each voyage, raising the ship's center of gravity and reducing its speed and maneuverability (Horton, 1994, pp. 1-2). Our society seems ready to drown in a sea of paper.

Substituting online documentation for paper manuals is one solution to this problem. In its simplest form, online documentation uses the computer as a medium to communicate instructional material, which can include everything from a two-line electronic mail message to online books and magazines. Preparing effective online documentation involves more than simply transferring the contents of a book to the computer screen, however, and in his book Designing and Writing Online Documentation: Hypermedia for Self-Supporting Products, William Horton provides a comprehensive and easily understood guide to this process. An updated version of Horton's earlier book Designing and Writing Online Documentation: Help Files to Hypertext, this edition considers online documentation as a unique medium rather than as a substitute for paper publications. Reflecting the increased use of multimedia computing, this edition also includes more examples of graphics, animation, sound, video, and interactivity.

Horton presents a systematic introduction to the primary tasks involved in designing and producing online documents: planning and organizing document content (including the selection of topical links), writing instructional dialog (including the design of screen displays and the development of "user friendly" syntax), the design of supportive (not superfluous) graphics, and the inclusion of multimedia enhancements such as sound, music, and voice. Horton also includes separate chapters on online "help" and computer-based training.

Horton's book is an excellent example of the instructional design principles he advocates in his text. Each chapter is thoughtfully designed for the reader's cognitive convenience. Complex concepts are subdivided (or "chunked," as Horton says) into easily understood one- and two-paragraph sections. A simple design scheme of dark and lightly shaded headings indicates topics and their subtopics, so that the reader is continually aware of the conceptual organization of each chapter. Information is organized according to the kind of data being presented; for example, sequential information is displayed in linear form, and hierarchical information is depicted using decision "trees" and flowcharts. Graphics are used to illustrate key ideas and summarize content, and a "Putting These Ideas to Work" section restates the essential ideas of each chapter in a practical, action-oriented context.

Underlying this guide's well-written text and excellent graphics is the thought that the cognitive design of the project, not the "glitziness" of the software used to produce it, should drive the content of online documentation. Powerful computer hardware promotes an exponential increase in the sophistication of the software that supports its use. Many designers interpret this condition as a license to include a distracting number of "bells and whistles" in their products that serve only to confuse the user. Throughout his book, Horton cautions designers to use technology in the service of instructional purpose rather than as an end in itself.

Designing and Writing Online Documentation: Hypermedia for Self-Supporting Products would be an appropriate textbook for an introductory course in the instructional design of online documentation. Its effective mix of research information and practical advice would also be useful for writers and designers of online documentation. Because each chapter can be read independently of the rest of the book with no loss of comprehension, this book is an ideal reference text as well as an instructional guide.

In a world where information threatens to choke us all, William Horton provides an attractive, cost-effective alternative to paper-based publications. Indeed, the design and production principles that Horton includes in this book may encourage the development of an online product that is an improvement over the original.

Mary Ellen Litzinger
Education Librarian
Penn State University Libraries
University Park, PA 16802


Innovation and the Library: The Adoption of New Ideas in Public Libraries. Verna L. Pungitore. Westport, CT: Greenwood Press; 1995: 189 pp. Price: $49.95. (ISBN 0-313-28673-6.)

Professor Pungitore has written a book about innovation and the diffusion of ideas in public libraries. Unfortunately, for most practicing public librarians, the words "innovation" and "diffusion of ideas" are a misnomer with regard to the daily operations of a public library. Written from a macro policy perspective, Innovation and the Library begins with an interesting narrative history of public libraries in general and ends with a lengthy, and somewhat tedious, discussion of the Public Library Development Project (PLDP). Along the way we get some discussion of organizational change, primarily in Part I. Surprisingly, for a book so entitled, there is little mention of technology; specifically, computers and the Internet are barely mentioned. Pungitore does state at the beginning of her book, "This is a book about organizational change and innovation . . ." (p. 1). However, to completely ignore the impact of computers and telecommunications on public libraries at every service level does a disservice to the new library and information student and policymakers alike. The greatest organizational change in public libraries over the past ten years has been the gradual elimination of card catalogs, and of the many library employees and services that maintained them, due to online public access catalogs (OPACs). For many public librarians the introduction of technology is the only innovation, or new idea, experienced in the past twenty years.

In Part I, a discussion of change and innovation in public libraries provides a social history of public libraries up until 1965. The author parallels U.S. and public library history from the Civil War era to the revised Minimum Standards for Public Library Systems, 1966 (Public Library Association Standards Committee). Interspersed is writing on diffusion and innovation, primarily in chapter 2. The author admits, "The intent is not to present a comprehensive, scholarly review of the literature" (p. 25); however, the discussion is based almost entirely on the writings of Havelock (1968, 1969) and is somewhat dated and limited to the field of education. There is an interesting discussion of organizational change in chapter 1 that is too short; it is worthy of further explication. Overall, Part I suffers from the author's inability to integrate the vast body of organization theory literature into public library settings. Perhaps in her next book?

Part II of Innovation and the Library equates the history of the Public Library Association with change and innovation in public libraries. The author states, "It is a central thesis of upcoming chapters that the Public Library Association has taken on an increasingly more aggressive and effective change agent role during the past two decades" (p. 71). Pungitore bases this observation on accreditation practices, the integration of research findings, and the chronicling of the Baltimore County Public Library's experimentation in the area of planning and performance measurement. The author goes into great detail about the birth of PLA, the publication of De Prospo et al.'s Performance Measures for Public Libraries (1973), and "the experimentation with planning and performance measures that was occurring at the Baltimore County Public Library" (p. 85). The scope of Pungitore's "change agent" criteria is somewhat limited and in keeping with the macro approach of this book. There are no statistics showing a relationship between the activities of PLA and public libraries' current and past practices. The effectiveness of PLA at the implementation level is not addressed specifically. An interesting paradigm for any professional association is aptly characterized by Pungitore: "well, what can PLA do? We need to meet the local needs of our constituents, the local librarians, and be more responsive to them" (p. 80). Pungitore also notes, "In 1981, the PLA Executive Board formally endorsed the move from national standards to local community-based planning, supplemented by state standards or guidelines" (pp. 95-96). The California Code alone has half of one volume dedicated to library services, a far greater indicator of change and innovation at the macro level than the ethereal PLA. Various city codes are also another indicator. Part II ends with a thorough, if somewhat tedious, discussion of the Public Library Development Project (PLDP).

Part III begins with a tertiary discussion of the diffusion of ideas among smaller public libraries. More professionals actively involved in professional activities, concludes Pungitore, along with greater state agency involvement and training of library staff, ". . . may facilitate their [smaller public libraries'] fuller participation in the communications network that links persons and organizations in the public library field" (p. 135). The chapter on implementation of the PLA Planning Manual offers no quantitative data and has a limited and varied public library sampling size; the different libraries are presented more as case studies. The final chapter states, "Facilitating innovative organizational change in a well-established, but historically underfunded, bureaucratic institution such as the public library presents a difficult challenge" (p. 171). This is a syndrome not unlike the "Gresham's Law" of planning coined almost forty years ago by March and Simon (1958): "Daily routine drives out planning." In a larger sense, organizations consumed with day-to-day operations have a very difficult time planning for change; the scarcer the resources, the slower change comes about. One only needs to reread For the People: Fighting for Public Libraries (1979) to see that not much has changed in public library funding in this age of deficit reduction and balanced budgets.

Overall, Innovation and the Library suffers from its limited scope (the PLA as change facilitator/innovator) and undeveloped concepts. A more relevant study would connect PLA activities to increased public library funding and services. The author goes over far too much library history (the book is only 180 pages long) and averts several key change issues in public libraries: computers, telecommunications, part-time staff, decreased funding and increasing media costs, and the incorporation into, and larger role for, libraries in community services divisions all go unaddressed. The book is worthwhile to library students and public libraries for its PLA history.

Alan T. Schroeder, Jr., MLIS/MPA
Associate Librarian, Computer/Audiovisual Services
Santa Fe Springs City Library
Santa Fe Springs, CA 90670

References

March, J. G., and Simon, H. A. (1958). Organizations (pp. 185-187). New York: John Wiley.

Seymour, W. N., Jr., and Layne, E. N. (1979). For the People: Fighting for Public Libraries. Garden City, NY: Doubleday.

Global Perspectives on the Ecology of Human-Machine Systems. John M. Flach, Peter A. Hancock, Jeff Caird, and Kim J. Vicente, Eds. Hillsdale, NJ: Lawrence Erlbaum Associates; 1995: 424 pp. Price: $36.00. (ISBN 0-8058-1382-9.)


Local Applications of the Ecological Approach to Human-Machine Systems. Peter A. Hancock, John M. Flach, Jeff Caird, and Kim J. Vicente, Eds. Hillsdale, NJ: Lawrence Erlbaum Associates; 1995: 488 pp. Price: $39.95. (ISBN 0-8058-1380-2.)

There is an old adage that there is nothing so practical as a good theory. In the area of human factors engineering (or ergonomics), good theories have been scarce indeed. Too often, human-machine system design is reduced to localized problem-solving with little or no scientific foundation for design decisions.

In these two volumes, contributors from academia, industry, and government provide the theoretical framework and some of the necessary research findings for a human factors engineering discipline. The foundations of the psychological theory are found in the work of Egon Brunswik in the late 1950s and, in later decades, that of James Gibson. This school of thought has been christened "ecological psychology" for its emphasis on explanations of human behavior rooted in the relationship, or interplay, of the individual with the environment. Ecological psychology has historically been at odds with the more conventional "organismic" psychology, which stresses the importance of defining the characteristics of the individual independent of the surrounding environment. Readers without a psychology background might find these distinctions rather esoteric. However, as the contributors to these volumes point out, the organismic approach generally sacrifices ecological validity in its research by removing the (contaminating) influence of these "real-world" variables. From the ecological psychology viewpoint (as expressed in these volumes and elsewhere), this lack of ecological validity has been the main reason for the general ineffectiveness of conventional psychology in influencing system design.

The first volume of this two-volume series addresses the theoretical elements of ecological psychology and its relationship to human factors. How ecological psychology differs from the conventional, organismic view of human behavior, and how the ecological view can serve as a more effective basis for human-system interface design, take up the first half of the first volume of the series. For those not having a background in psychological theory, the arguments put forth in these chapters may prove difficult to follow. This is particularly true when the authors turn to psychological model development, representational design, and psychophysics. Also, as is often true of those holding strong theoretical beliefs, the contributors sometimes belabor points of departure between the ecological and organismic viewpoints. A plethora of turgid phrases such as "agent-environment mutuality assumption" and "dynamics of perceivable tokens/forms" frequently serves to obscure, rather than clarify, important theoretical issues and important relationships between ecological psychology and the practical disciplines of human factors engineering. Despite these difficulties, the early chapters of Volume 1 are worth reading because they provide the necessary bridge between the often arcane theoretical issues and the discipline and practice of human factors.

The second half of Volume 1 addresses the essential theoretical concept of affordances and their role in ecological psychology and in systems design. "Affordance" was a term originally coined by Gibson and later popularized in Don Norman's The Psychology of Everyday Things. Briefly, affordances are properties of an individual's environment that support actions of the individual in performing a task. The chapters in the latter half of Volume 1 represent one of the most comprehensive discussions of the concept of affordances currently available.

Volume 2 of the series addresses the practical matter of designing systems for human use. Examples of applications of the ecological approach to system design range from vehicle control to organizational behavior. However, much of the volume is devoted to research and applications in visual perception, particularly in vehicle control, locomotion, and telepresence. The emphasis on vision is due, in part, to the historical roots of ecological psychology in visual perception and, in part, to the fact that most of the supporting research to date has been in areas such as motion perception and manual control. Some experimental findings are presented in areas such as topographic map reading, decision-making and problem-solving, and organizational support system design. Conspicuous by its absence, however, is any substantial work addressing the design of human-system interfaces in the more mundane, but more common, information processing task environments. Research demonstrating the application of the ecological approach to a variety of information processing tasks, such as word processing or information retrieval, is essential for the ecological perspective to find a wider audience in the design community.

In summary, these two volumes are important contributions to establishing a science of human engineering that can provide reliable and practical guidance in optimizing system design. Those involved in both the theoretical and the practical aspects of human factors engineering will find the volumes both useful and thought-provoking.

Alfred T. Lee, Beta Research, Inc., P.O. Box 2713, Cupertino, CA 95015. E-mail: [email protected]

Software and Intellectual Property Protection: Copyright and Patent Issues for Computer and Legal Professionals. Bernard A. Galler. Westport, CT: Quorum Books; 1995: 205 pp. Price: $55.00. (ISBN 0-89930-974-7.)

The legal system has a history of producing copyright and patent law to protect various forms of intellectual property. However, in the last 15 years, developments in technology and software have revolutionized the potential application of these laws. This book focuses on the creation and protection of software as intellectual property, with particular emphasis on the body of case law which has emerged over this time period. The Copyright Act of 1976 was the genesis of the most comprehensive intellectual property protection legislation ever developed, and the 1980 amendments helped to connect the original act with the various new technologies; nevertheless, computer software does not fit neatly into either copyright or patent law and, therefore, has struggled within the confines of existing legislation.

The author, Bernard A. Galler, is eminently qualified to write such a work. Professor Emeritus at the University of Michigan, he has had considerable experience serving as an expert witness in cases involving intellectual property rights. His biographical note indicates that he is “founder and president of the Software Patent Institute, founding editor-in-chief of the Annals of the History of Computing, and a former president of the Association for Computing Machinery.” He is also the author of The Language of Computers (1962) and A View of Programming Languages (with A. J. Perlis, 1970).

This book presents its message through an extensive presentation of relevant case studies grounded in actual court cases. The reader is invited to examine these cases and the briefly summarized arguments in order to think critically about the issues involved. The expected audience includes “lawyers, computer professionals, and anyone interested in Intellectual Property Law as it applies to computer software.” Both copyright and patent issues are discussed, as further illustrated by appropriate case law examples; however, trade secret law was deemed to contain little pertaining to the computer and, therefore, has not been included. Galler states that “computer software appears to fit rather well under existing law in this area” (p. 2).

480 JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE-June 1996

As with all copyright and patent law, interpretations relating directly to computer software have wavered between the poles of providing protection and not providing protection. Particularly because legal applications in this arena are relatively recent, it has been, and will continue to be, necessary for case law and correlated precedents to evolve over time. Galler leads the reader through this morass of case law and provides pertinent examples to highlight various legal milestones.

In Chapter 1, legal issues are introduced. Galler states that “much of the litigation in the computer field is really not specific to computers” (p. 7) and cites breach of contract, misappropriation of trade secrets, and anti-trust violations as issues that are more general in nature. A paradox is presented: computers are machines and, as such, can be patented. Then how shall software be defined? Since software in and of itself is meaningless until loaded into a computer and, once loaded, could be defined as being “part” of the computer, is the software then part of the machine?

Other issues and arguments presented in the following chapters include (pp. 8-9):

• Software is utilitarian; that is, its only purpose is to cause a machine to follow some instructions to solve a problem. There is no expression involved that deserves copyright protection.

• If software is written in a high-level language on paper, punched on cards, or stored in random-access memory (RAM), it is different from machine-language zeros and ones stored in read-only memory (ROM), and different kinds of protection are needed.

• While it is true that application programs written by users look like literary expression and should be copyrightable, operating systems are not visible to the user, and are really part of the computer, hence not subject to copyright protection.

• If a company is licensed to copy another company’s hardware, it is entitled to copy the microcode also, because there is really only one way to write microcode efficiently. Microcode is also an integral part of the control unit, which is the heart of the computer; therefore, it is hardware.

• What appears on the screen cannot be protected because it represents the idea or the design of the user interface, and not the expression. Besides, certain screen presentations are by now so standard that they can be used freely by anyone. In order to compete, another company must make its product look the same to the user. It is better for consumers if they do not have to learn several different ways of doing things.

• How can a company undertake any kind of software development when it must always face the prospect that someone will claim copyright or patent infringement? Is there any way to somehow protect the process of software generation so copyright and patent infringement charges are simply recognized as untenable? Can this be done without incurring unreasonable expense?

While some of these issues have already developed precedents in case law, others have either not yet been tested or have not been decided definitively. The issues are complex and will undoubtedly affect the future development of the software industry.

Chapter 2 tackles the concept of the difference between idea and expression. When is a product an “idea” that leads to the development of a patentable invention? And when is the idea a first step whose further expression requires copyright protection? One possible permutation is that it can be demonstrated that there are a large number of ways to write a computer program, all of which may be expressions of the same idea but different from each other; therefore, what does it mean to be different, or similar?

In Chapter 3, software patents are discussed. As mentioned earlier, since copyright law generally was associated with literary and artistic expression, and patent law dealt primarily with machines and processes, the distinction between the two was fairly understandable. However, with the confusion engendered by the question of whether software was a stand-alone expression or integral with the machine, the boundary between the two forms of protection law became blurred. As in the other chapters, case law examples are given in order to shed light on this complex question.

Chapter 4 returns to the discussion of copyright law in reference to that portion of the 1976 Copyright Act which states in part: “Copyright protection subsists . . . in original works of authorship fixed in any tangible medium of expression. . . .” Is software embodied in such a “tangible medium” as a screen display therefore to be governed by copyright law? Cases are presented that also pose questions concerning source code and object code.

The concepts of validity and scope are featured in Chapter 5. Some arguments ask: Should the copyright have been issued at all for a particular piece of software? Was the copyright marker visible on the product? As before, examples develop the various points of view related to these arguments. Infringement is the subject of Chapter 6. Galler defines infringement as follows: “Given the exclusive, but limited, rights reserved to copyright holders in the Copyright Act, infringement would be actions that violate those rights without permission, subject to the exceptions that limit those rights” (p. 67). Discussion in this chapter covers piracy, excessive student collaboration in the academic environment, and situations when there is “no admitted copying and no obvious smoking guns” (p. 69). The gray area of similarity is presented: whether a given instance is direct copying or simultaneous development. Substantial similarity, the subject of Chapter 7, carries this discussion further.

The Look and Feel of two pieces of software forms the basis for Chapter 8, with competing spreadsheets presented as examples. Chapter 9 examines Reverse Engineering: the observation, study, and testing of a program, and the controversies that have emerged when this practice is applied to computer software. Arguments regarding this issue concern using the same computer to make a number of modifications to the original program (preserving the essential characteristics but making the program look different from the original) and also to deduce trade secrets from examination of the source code. It is here that the concept of “Fair Use” is discussed.

The final substantive chapter considers the Clean Room Approach, which is a new area for legal argument. This approach attempts to provide a competitor with a basis for defense by demonstrating that the product in question was independently developed, because the people involved in the development worked in an isolated (clean room) environment. Since case law is not yet available, Galler gives guidelines for developing a “clean room.”

Perhaps a major contribution of this book (and one certainly admired by the author of the book’s “Introduction”) is Appendix A, which offers a review of the fundamentals of computer technology. This clearly constructed review provides an excellent foundation for readers who are less well-versed in computer technology. Three further appendices are also available to the reader. Appendix B gives an example of a patent; Appendix C provides a concurring legal opinion; and Appendix D contains a citations list for the cases discussed in the main body of the book. There is also a brief list of recommended reading.

Galler has made readable an extremely complex set of issues. For those interested in following the intricacies of intellectual property, this book provides an approachable entrée into what may otherwise be an impossibly confusing body of literature. It is a well-written text and a good introduction for the novice; at the same time, it has depth for the more knowledgeable reader.

Darlene E. Weingand, Ph.D., Professor, University of Wisconsin-Madison, School of Library and Information Studies, 600 North Park Street, Madison, WI 53706. E-mail: WEINGAND@DOIT.WISC.EDU

The Artificial Life Route to Artificial Intelligence: Building Embodied, Situated Agents. Luc Steels and Rodney Brooks, Eds. Hillsdale, NJ: Lawrence Erlbaum; 1995: 288 pp. Price: $29.95. (ISBN 0-8058-1519-8.)

This collection deals with an alternative route to the elusive goal of AI. Forsaking the “traditional” problem-based approach, which attempted to produce “machines which think,” the researchers taking the Alife path concentrate on developing machines which “behave,” “function,” or “act,” as the term “agents” implies. Rather than try to develop a replica of human problem-solving, for example, this approach concentrates on the performance of a machine in its environment, with the emphasis being placed on real-world environments, not the “block worlds” beloved of the more traditional problem solvers.

Performance of so-called lower-level functions, the sensorimotor functions rather than the cognitive ones, is felt to be where the elusive goal may lie: “Cognition depends on the kinds of experience that come from having a body with various sensorimotor capacities” (Varela, p. 17). This explains the inclusion of “embodied” in the title; “situatedness” is a matter of context, hence the emphasis on real-world settings, so that the agent is posed environmental challenges to which it must respond appropriately.

Rather than working from, or attempting to construct, an overall “map” or representation of its environment, the agent deals with “microworlds,” each relevant to some particular aspect of its involvement with that environment.

Attention moves from how the creation and manipulation of symbols may be accomplished to considering which linkages between perception and motor systems are necessary to bring about an efficient agent.

The preface summarizes Francisco Varela’s introductory chapter, The Re-Enchantment of the Concrete, in terms such as those used above, and the chapter is a similarly clear and concise statement of what might be described as the philosophical position underlying Alife: at first sight an extreme form of reductionism, dispensing even with a thinking machine in the now-commonplace sense of the term. Just attend to the machina, and let the deus look after itself, Varela appears to be saying: “Cognitive structures emerge from recurrent patterns of perceptually guided action” (p. 20).

Varela’s arguments are persuasive, and he rather convincingly describes everyday experience as a movement through a series of microworlds, for each of whose recurrent entities we have a readiness-for-action. The interesting part, he postulates, is the hinge which holds these microworlds together, the organizing agency which gets the organism from one microworld to the next, and it is in this, he argues, that autonomy is constituted.

Because the perceiver interacts with the perceived world, and its relationships with the world and the world perceived are mutually modifying, perception must be conceived in terms of the perceiver’s interaction with the world perceived. The world has changed because the perceiver is at work in it; it exists only as something which is perceived and interacted with, and there is no objectively existing, perceiver-independent world. Bishop Berkeley, Hume, and Heisenberg neatly disposed of, Varela slips us the offhanded “it is contrary to the views familiar to us from the Cartesian tradition.”

He postulates “fast dynamics” between competing microworlds, emerging from chaotic neural activity, as the “cradle of autonomous action,” using here what he has explicitly denied himself in the argument before: an instantiation of “agents” at a neural level. The example is of a rabbit’s cortical activity on recognition of a familiar smell, when a pattern emerges from a chaotic background of cortical activity and remains until the smelling action is over. Varela goes on to observe this behavior in other species, and to speculate that it “suggests” that all the distinct agents activated by the current situation are competing to be selected by their similarity to the situation in hand. This activity occurs not only in sensory areas of the brain, and “is a very good candidate for the neural correlate of the autonomous constitution of a cognitive agent.”

This last series of steps is neatly taken, but appears in context to be the author’s own argument, unsupported here by other evidence. The next step, however, is the big one: to get from this sensorimotor activity to something recognizable as a cognitive process.

Brooks’ chapter on research programs gives the background history in the fields of knowledge of the environment and vision, and describes work designed to achieve embodiment of these features in mobile robots. He also examines some of the assumptions which AI has based on the biological sciences. He then expands on the concepts of situatedness, embodiment, intelligence, and emergence, and argues that:

• Agents do not need an objective model of the real world, but can refer to the real world itself: “the world is its own best model”;

• Giving an intelligent agent a body ensures that it is forced to deal with real challenges, and that its reasonings about its surroundings are “grounded” in real entities;

• Intelligence is based on interaction with the real world;

• Intelligence is an emergent property: it is determined by the total behavior of the system, rather than seated in any single component or ability.

Brooks discusses his work at MIT since 1984, building mobile autonomous robots which can interact in real time with a real, changing environment: that of an office with people moving through it, and subject to changes in furniture and lighting. Examples of robots which have been constructed are discussed in relation to issues such as convergence, synthesis, and learning of behaviors, and the complexity and representation of environments.

Steels’ chapter, Building Agents out of Autonomous Behaviour Systems, explains a “behaviour-oriented approach,” in which each behavior system, for example collision avoidance, extracts from the outside world only the information necessary to it, decides when to become active, and initiates the appropriate action. Each system is its own control locus, but potential conflicts are avoidable: behaviors can take place in parallel, with external circumstances determining which is dominant, or, in the case of a genuine potential conflict, motivational variables are introduced.
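The behaviour-oriented scheme just described lends itself to a compact sketch. The Python fragment below is purely illustrative and is not Steels’ implementation: the class names, the sensor dictionary, the distance threshold, and the numeric motivation weights are all invented for the example. Only the overall shape follows the review’s description: independent behavior systems that each filter their own sensor data, decide when to become active, and defer to a motivational variable when a genuine conflict arises.

```python
# Hypothetical sketch of a behaviour-oriented controller. Each behavior
# system reads only the sensor data relevant to it, decides whether it is
# active, and proposes an action; a motivation weight breaks conflicts.

class BehaviorSystem:
    def __init__(self, name, motivation):
        self.name = name
        self.motivation = motivation  # used only to resolve genuine conflicts

    def is_active(self, sensors):
        raise NotImplementedError

    def action(self, sensors):
        raise NotImplementedError

class CollisionAvoidance(BehaviorSystem):
    def is_active(self, sensors):
        # Activates only when an obstacle is near (threshold is illustrative).
        return sensors["front_distance"] < 0.5

    def action(self, sensors):
        return "turn_away"

class Wander(BehaviorSystem):
    def is_active(self, sensors):
        return True  # default behavior, always willing to act

    def action(self, sensors):
        return "move_forward"

def control_step(systems, sensors):
    """Let external circumstances select the active systems, then let the
    highest motivation dominate among them."""
    active = [s for s in systems if s.is_active(sensors)]
    winner = max(active, key=lambda s: s.motivation)
    return winner.action(sensors)

systems = [CollisionAvoidance("avoid", motivation=2.0),
           Wander("wander", motivation=1.0)]
print(control_step(systems, {"front_distance": 0.3}))  # obstacle near: avoidance dominates
print(control_step(systems, {"front_distance": 2.0}))  # path clear: wandering proceeds
```

Note that no central world model appears anywhere in the sketch: each system queries only the sensor reading it cares about, which is the point of the decentralized, behaviour-oriented design the review summarizes.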

Implementation of these behavior systems is discussed at some length, and the discussion is complemented by pseudocode, diagrams, and photographs of a robot in operation. Unfortunately, the photographs are somewhat lacking in clarity.

Smithers’ chapter argues against the simplistic view that intelligent systems are information-processing systems. He gives three “sketches” of real robots, constructed to deal with the problems of, respectively, “not getting stuck,” map-building and self-organization, and computation and real control of movement.

He draws the conclusion that an agent’s reality is constituted in its dynamic relationship with its environment, the agent’s internal sensorimotor structure changing dynamically as a result of its interactions with its environment, and continuously modifying its response to those interactions. Smithers, then, has taken steps towards the construction of autonomous agents, and holds that autonomy is a necessary, if not a sufficient, condition for intelligence.

The two technical contributions, by Mataric and MacFarland, deal with the specific issues of maps for navigation and cost-optimizing functions.

The collection concludes with five Position Papers, each written by practitioners in one of the fields relevant to the construction of intelligent autonomous agents: robotics, knowledge engineering, psychology, classical AI, and the philosophical approach.

These short papers are perhaps the most exciting part of the collection, in that they analyze critically the current state of the field, and present the directions in which the authors feel that work should now be progressing.

To a reader familiar with the “classical” approach to AI, this collection comes as both a surprise and a rather invigorating new approach to a field which had seemed to “stall” somewhat in the last few years. It should certainly stimulate discussion, and it is to be hoped that further productive work will result.

A final quibble with the title: there may possibly be such a thing as AI, and with many it is something of an article of faith that there can be. However, until we know where it may lie, it must remain presumptuous to designate anything “the route to AI.”
