Roomware—Moving Toward Ubiquitous Computers




In the past, a central mainframe computer provided terminals for many users. In the current age of the personal desktop computer, there are one or more computers for each person. It’s only in rare cases that these computers work together to jointly provide functions. In the future, computational power will be ubiquitous. In such an environment, devices that support humans interacting with information at work and in everyday activities will complement the desktop computer. These devices will be closely interconnected and integrated with the environment and context in which people use them. They indeed will synchronously work together to support fluent collaboration.

Collaboration between users and environments with multiple interconnected devices will determine, to a large degree, approaches to work and everyday activities. An example of this type of device is roomware, or computer-augmented objects resulting from the integration of room elements, such as walls, doors, and furniture, with computer-based information devices. The roomware components that we have developed at Fraunhofer IPSI support the vision of a future where our surroundings act as an information interface, and the computer as a device disappears from our perception. Three main observations influenced the creation of roomware components:1

• the growing importance of information technology,

• the need to integrate information technology with the environment in which it is used, and

• the recognition that new work practices will emerge to cope with the increasing rate of innovation.

Growing importance of information technology

Only a few remaining domains of modern social and economic life are independent of information technology. Within the working context, it’s almost a requirement to use digital information as a major part of everyday work. The desktop computer as a device is ubiquitously present in every office.

This idea was further pursued by Mark Weiser, who coined the term ubiquitous computing.2

We interpret this view as the design goal of a two-way augmentation and smooth transition between real and virtual worlds. Combining real and virtual worlds in a computer-augmented environment—resulting in hybrid worlds—lets us design enabling interfaces that combine everyday reality and virtuality. We seek to use the best aspects of both worlds.

Our goal is to transform and transcend human-computer interaction, resulting in rather direct human-information interaction and human-human cooperation, making computers disappear. In our approach, we distinguish between two types of disappearance (http://www.ambient-agoras.org). Physical disappearance of computer devices comes about by making the computer-based parts small enough to fit in the hand, interweave with clothing, or attach to the body. Mental disappearance of computers occurs when they become invisible to the user’s mental eyes. The important aspect here is that humans do not perceive the devices as computers anymore, but as embedded elements of augmented artifacts in the environment. In our actual designs, we combine these two types of disappearance.

Peter Tandler, Norbert Streitz, Thorsten Prante
Fraunhofer Integrated Publication and Information Systems Institute

COLLABORATIVE WORK APPROACHES, FACILITATED BY UBIQUITOUS COMPUTING, WILL SHAPE FUTURE WORKING ENVIRONMENTS. ROOMWARE COMPONENTS INTEGRATE EVERYDAY OFFICE FURNITURE WITH COMPUTER-BASED INFORMATION DEVICES.

0272-1732/02/$17.00 © 2002 IEEE

P. Tandler, N. A. Streitz, Th. Prante. Roomware—Moving Toward Ubiquitous Computers. In: IEEE Micro, Nov/Dec 2002, pp. 36-47.

Integration with the environment

When the computer disappears in the sense defined previously, developers will need to widen their concept of computers and information technology. Their focus must begin to include the environment and context in which the computer is used. Architecture, as one major part of the surrounding environment, has a high degree of influence in determining our living and working habits. (Here, we use architecture to refer to the architecture of buildings, not software.) In addition, interaction with physical objects both as sources of information (such as books, magazines, or drawings) and tools or furniture is natural and normal for the way people live and work.

This is in sharp contrast to the way people work with digital information today, which reduces interaction with information to interaction with computers in a very limited way. The computer has no information available about where it is located. It has no clue about the physical distance to other devices connected over the network; it’s only aware of directly connected devices, such as its monitor, keyboard, mouse, and, perhaps, a printer or scanner. A computer cannot figure out whether or not there is a large public display close by, which could show information used for a discussion among several people. It normally would not even sense that several people are trying to look at its small screen, because most current operating systems enable application use by a single user only.

Besides the need for awareness of the physical context in which a device is used, it’s also helpful to make other types of context available.3-4 Examples include the number of people present, their roles (in terms of team members or visitors), and their current tasks. This type of context information is closely related to the third observation.

New work practices

Looking at work environments, work is increasingly characterized by a high degree of dynamics, flexibility, and mobility. Initial examples include such new work practices as on-demand and ad hoc formation of teams. Contents and participants as well as contexts, tasks, processes, and structures of collaboration will change frequently to cope with the increasing rate of innovation.

There is empirical evidence that teams work more effectively if they develop balanced proportions of individual work, subgroup activities, and full-group work.5 The same study found that the range and combination of information devices available determined, to a significant extent, a team’s flexibility in relation to working in different modes.

This stresses how important it is to provide the methods and tools to support different work phases in teams. Besides the need for new software tools, this variety of work phases influences other parts of the work environment. In the past, the role of architecture and office buildings addressed, to a large extent, the needs of individual work. In the future, work environments will increasingly address the needs for collaboration and communication. This evolution raises the need for appropriate furnishings and equipment. Roomware components address these needs, providing physical artifacts that meet the requirements of flexible reconfigurability.

Software developer challenges

Obviously, roomware components differ from desktop PCs in several ways. Interaction in a multiuser, multidevice environment supported by roomware components requires new user interface concepts for efficient interaction. Roomware components developed as part of the i-Land—an interactive landscape for creativity and innovation—project base interaction on a pen or finger, instead of a mouse and keyboard.1

However, all these features pose a challenge for the software developer of roomware applications. Whereas several well-established models, frameworks, and tools exist to aid application design for a traditional PC,6 roomware application developers cannot draw upon such tools. They are in the same situation as that of pioneering computer scientists when PC use first began to spread.

Roomware components

Our approach to roomware component design is to meet, in parallel, the requirements of flexible configuration and dynamic resource allocation in physical and information environments. The term roomware was originally coined by Streitz and his Ambiente team5 and is now a registered Fraunhofer trademark. However, it also applies to a general characterization of this approach and to products in this area.

The general goal of developing roomware is to design integrated physical and virtual information spaces. In the context of supporting teamwork, roomware components should let users tailor and compose them to form cooperation landscapes, serving multiple purposes such as project team rooms, presentation suites, learning environments, information foyers, and so on. These goals have in common the requirement of developing software that enables new forms of multiuser, multiple-display, human-computer interaction and cooperation.

In 1997, we began work on a testbed for experimenting with roomware components. We call this environment i-Land. In 1999, as part of the research and development consortium Future Office Dynamics (http://www.future-office.de), we developed—together with industry partners—the second generation of roomware. We redesigned existing components and developed an additional component called ConnecTable. Figure 1 shows this and other roomware devices: an interactive electronic wall (DynaWall), interactive electronic table (InteracTable), and mobile and networked chairs with an attached interactive display (CommChairs).

We developed the Basic Environment for Active Collaboration with Hypermedia (Beach) software framework as infrastructure to support synchronous collaboration with roomware components.7 Beach offers a user interface that addresses the needs of devices that have no mouse or keyboard, and require new forms of human-computer and team-computer interaction. To allow synchronous collaboration, Beach builds on shared documents concurrently accessible via multiple interaction devices.

Large group activities: DynaWall

The DynaWall provides a large whiteboard-like display that serves the needs of teams working in project and meeting rooms. It is the electronic equivalent of large areas of assembled sheets of paper covering the walls for creating and organizing information. The current realization consists of three segments with back projections and large touch-sensitive display surfaces (http://www.smarttech.com). The total display size of 4.50 × 1.10 meters can cover one side of a room (see Figure 1). Although driven by three computers, Beach provides one large, homogeneous workspace with no interaction boundaries between the segments. Two or more people can either work individually in parallel or share the entire display space.

Figure 1. Second generation of roomware: ConnecTables, CommChair, InteracTable, and DynaWall.

To acknowledge the large visual surface, Beach supports spatial hypertext documents. Workspaces can be positioned freely on the surface. As the touch-sensitive displays enable direct interaction with the documents using pen or finger, users draw strokes rather than clicking (that is, tapping) on the surface. Beach therefore allows drawing informal scribbles. It also allows pen gestures to invoke commands. Users can gesture to create, for example, new workspaces or hyperlinks, or remove objects.

In addition to the general design of a pen-based user interface, DynaWall’s size creates challenges for human-computer interaction. For example, it would be cumbersome for a user to drag an object or a window over a distance of more than 4 meters by having to touch the DynaWall the entire time (similar to holding down the mouse button). We developed two mechanisms to address these types of problems. Similar to physically picking up an object and placing it somewhere else, our take-and-put feature lets users take information objects at one position, walk over to a different place, and put them somewhere else on the display. For bridging the distance between several closely cooperating people during group brainstorming, for example, the shuffle feature lets users throw an object from one side of the display to the other, where another team member can catch it (by touching it).
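The take-and-put and shuffle behaviors described above can be sketched roughly as follows. This is an illustrative Python sketch, not the original Beach implementation; the function names, the display-width constant, and the linear damping model for a throw are all our assumptions.

```python
DISPLAY_WIDTH_MM = 4500  # the DynaWall is 4.50 meters wide

def shuffle_target(x_mm, velocity_mm_s, damping=0.5):
    """Project where a thrown object comes to rest: the stroke velocity
    decays under a simple linear damping model, clipped to the display."""
    travel = velocity_mm_s * damping
    return max(0, min(DISPLAY_WIDTH_MM, x_mm + travel))

def take(display_objects, obj):
    """Take: remove the object from the display into a carried state."""
    display_objects.remove(obj)
    return obj

def put(display_objects, obj, x_mm):
    """Put: place the carried object at a new position on the display."""
    obj["x"] = x_mm
    display_objects.append(obj)
```

A take followed by a put thus moves an object without any dragging, while a shuffle needs only a single short stroke at the object's current position.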

Individual work in a group context: CommChair

A CommChair, shown in Figure 2, is a mobile chair with a built-in computer. These chairs represent a new type of furniture, combining the mobility and comfort of armchairs with information technology. CommChairs let people communicate and share information with people in other CommChairs, standing in front of the DynaWall, or using other roomware components. They can make personal notes in a private space but also interact remotely on shared (public) workspaces, for example, making remote annotations at the DynaWall. Beach software provides the cooperative sharing functionality. To offer flexibility and mobility, each chair has a wireless network (we currently use an IEEE 802.11 WaveLAN) and an independent power supply.

Small-group collaboration: InteracTable

The InteracTable interactive table provides a space for creation, display, discussion, and annotation of information objects. It’s suitable for use by groups of up to six people standing around it. It consists of a large plasma display panel with a touch-sensitive surface (http://www.smarttech.com) embedded in the top of a stand-up table. People can write and draw on it with a pen and interact via finger or pen gestures with information objects. Users with a need for extensive text input can use a wireless keyboard.

The InteracTable is an example of how designers need to adapt the user interface to different form factors. Its horizontal display lets people stand around it, resulting in an interaction area with no predefined orientation. In contrast, vertical displays such as desktop computer monitors have a dedicated top and bottom, left and right. Thus, horizontal displays require new forms of human-computer interaction. To this end, Beach provides gestures for rotating individual information objects or groups of objects. This accommodates the need for easy viewing from all perspectives.
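One way to realize this kind of per-user rotation is to keep orientation in the view rather than in the shared document, so rotating an object for one reader never disturbs another. The following sketch is illustrative only; the class names are hypothetical and not taken from Beach.

```python
# Illustrative sketch (not the Beach API): on a horizontal display each
# view of a shared document keeps its own orientation, so rotating a
# view for one user leaves the underlying document untouched.

class Document:
    def __init__(self, text):
        self.text = text  # shared state, orientation-free

class View:
    def __init__(self, document, angle=0):
        self.document = document
        self.angle = angle  # per-view orientation in degrees

    def rotate(self, delta):
        self.angle = (self.angle + delta) % 360

doc = Document("brainstorm notes")
north = View(doc)             # user on one side of the table
south = View(doc, angle=180)  # user facing them across the table
north.rotate(90)              # only this user's view turns
```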

Furthermore, a user can create a second view of an object; this view stays synchronized with the original view.7 A user can shuffle this view to a colleague standing on the other side of the table, so this team member will always have a personal view of the same object in the preferred orientation. Now, both users can view, edit, and annotate the same object in parallel, as Figure 3 shows.

Figure 2. Tight collaboration using a CommChair in front of a DynaWall. The user in the chair can remotely annotate the wall by simply writing on the display attached to the chair.

Transition from individual to group work: ConnecTable

The ConnecTable component is part of the second generation of roomware. It eases the transition between individual work and small-group cooperation. Users can adapt the display height to accommodate different working situations, such as standing up or sitting. They can tilt the display to different angles, providing an optimal view. In this stand-alone mode, it is similar to a CommChair. To initiate tight collaboration, two users can move their ConnecTables together and arrange them to form a large display area, as shown in Figure 1. This makes the ConnecTables suitable for small-group work, just like the InteracTable.

To detect adjacent tables, integrated sensors measure the distance between the ConnecTables and initiate the automatic coupling of the displays once they are close enough. The current implementation uses a radio-frequency-based sensor, shown in Figure 4a. A coil integrated with the tabletop can detect radio-frequency identification tags, shown in Figure 4b, when the tag’s distance falls within a certain threshold. Although this implementation is simple to realize and can reliably detect other tables, it’s only possible to connect two ConnecTables. However, adding coils and tags on each side of the tabletop would make it possible to connect more tables.
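The threshold-based coupling logic described above might look like the following sketch. The threshold value, class name, and method name are assumptions for illustration; the paper does not specify the sensor handling at this level of detail.

```python
THRESHOLD_MM = 50  # assumed coupling distance, not from the paper

class ConnecTable:
    def __init__(self, name):
        self.name = name
        self.coupled_with = None

    def on_distance_reading(self, other, distance_mm):
        """Called by the sensor layer with the measured distance to a
        detected tag on another table. Couple below the threshold,
        decouple when the tables move apart again."""
        if distance_mm <= THRESHOLD_MM and self.coupled_with is None:
            self.coupled_with = other
            other.coupled_with = self
        elif distance_mm > THRESHOLD_MM and self.coupled_with is other:
            self.coupled_with = None
            other.coupled_with = None
```

In this sketch a coupling event would then trigger the display-joining functionality described in the next paragraph.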

Beach software lets workgroups use the resulting large display area as a common workspace,8 employing the same technology as that for coupling the DynaWall’s segments. Several people can then concurrently work and seamlessly move information objects across the physical borders of the individual displays. As on the InteracTable, people can create second views, shuffle them from one ConnecTable to another, rotate them there, and work on them in parallel with correct perspectives.

Software application model

Traditional application models don’t provide enough guidance for developers to create software systems, such as Beach, for roomware environments. Software developers must consider further aspects not relevant for software running on a desktop PC with monitor, mouse, and keyboard as standardized interaction devices.

Our proposed application model, shown in Figure 5, accounts for the properties of roomware and ubiquitous computing environments. It’s based on three design dimensions, describing orthogonal aspects of roomware applications. The first design dimension entails five models that separate the basic concerns of roomware applications and make up the application model’s structure. This structure provides reusable components and hooks for adapting to different devices. The second dimension entails the degree of coupling and the aspects of sharing information between devices. The third dimension entails the four levels that organize the application model. These levels define different levels of abstraction for software functionality.

Figure 3. To support horizontal displays, Beach lets different users rotate documents to a preferred orientation.

Figure 4. Sensor technology integrated into the ConnecTable (a). A coil and tag at the top of the display detect other tables (b).

Separating basic concerns

Clearly separating different responsibilities within the software helps provide the flexibility that different devices need. Therefore, we distinguish models for the data, application, user interface, environment, and interaction, as Figure 5 shows.
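The five separated concerns can be pictured as a minimal class sketch, with the dependency directions of Figures 5 and 6 expressed as constructor arguments. All names here are ours, chosen for illustration only; they are not Beach classes.

```python
class DataModel:
    """What users create and interact with; device-independent."""
    pass

class ApplicationModel:
    """Behavior and editing state layered on top of a data model."""
    def __init__(self, data):
        self.data = data

class EnvironmentModel:
    """Available devices, their configuration, and sensed context."""
    def __init__(self, display="desktop"):
        self.display = display

class UserInterfaceModel:
    """Which interface elements to offer, given application and environment."""
    def __init__(self, app, env):
        self.app, self.env = app, env

class InteractionModel:
    """Presentation and interaction style only; the sole place that
    decides how things look and how input is handled."""
    def __init__(self, ui):
        self.ui = ui
```

Note that the data model depends on nothing, while each layer toward interaction sees progressively more context; this is the dependency discipline the following subsections elaborate.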

The data model specifies the type of data that users can create and interact with. To work with data, the application model provides the necessary functionality. These two models are independent of the currently used or supported hardware device. Instead, the environment model describes available devices and other relevant parts of the environment. The user interface model defines the framework for how the software can present its functionality to the user, taking into account the environment model’s properties. These models are applicable to other applications besides those for ubiquitous computing and roomware. Yet, because of the heterogeneous environment ubiquitous computing applications operate in, they have a strong need for a structure that is clear yet flexible enough to adapt components independently for different situations.

Data model: Information objects. A common approach in application modeling is to separate the application model from the data, domain, or business object model.9 The data model relates to the information dimension identified by Jacobson et al.,10 while the application model represents the behavior dimension. This way, software developers can independently reuse both data and application models. They can specify and implement different applications for an existing data model. This reuse can save time if the current application domain has complex data structures or algorithms. Conversely, developers can reuse application models for different types of data, if they carefully define the interface between the application and the data at an appropriate level of abstraction.

Application model: Application behavior. Application models describe all platform- and interface-independent application aspects, such as manipulation of data objects. As application models define the application’s behavior, they specify control objects as defined by Jacobson et al.10 For a text object, the data model includes the string describing the text and text attributes such as font or size. The application model adds the text’s editing state, such as cursor position or selection. Applications can give awareness, for example, displaying the cursors of other users, if they have access to a shared editing state.11 To use different application models for the same data model, the data model must remain unaware of any application model and only represent the document state.
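The text example above can be made concrete in a short sketch, assuming hypothetical class names: the data model carries only document state, while each application model carries its own editing state over a shared data model.

```python
class TextData:
    """Data model: the string plus text attributes such as font or size."""
    def __init__(self, string, font="Helvetica", size=12):
        self.string, self.font, self.size = string, font, size

class TextEditor:
    """Application model: adds editing state (cursor, selection) that is
    deliberately kept out of the data model."""
    def __init__(self, data):
        self.data = data
        self.cursor = 0          # editing state, not document state
        self.selection = (0, 0)

    def insert(self, s):
        d = self.data
        d.string = d.string[:self.cursor] + s + d.string[self.cursor:]
        self.cursor += len(s)

shared = TextData("roomware")
alice, bob = TextEditor(shared), TextEditor(shared)
alice.insert("The ")  # both editors see the change; cursors stay separate
```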

Figure 5. Application model organized into four levels of abstraction and five models that separate basic concerns. Crucial for synchronous collaboration is the sharing dimension, responsible for distributed access to shared objects.

Researchers and software developers have found it helpful to choose a fine granularity for some application models. This way, developers can aggregate low-level application models with a well-defined functionality (for example, to edit simple text) into more complex models at a higher level of abstraction (for example, an editor that can manage complete workspaces). Usually, a whole hierarchy of application models composed of generic, reusable, and custom parts constitutes an application.11 The application model thus often forms a hierarchy that is isomorphic with respect to the containment hierarchy of its associated data model.9
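The aggregation of fine-grained application models into larger ones, mirroring the data model's containment hierarchy, might be sketched like this (all names hypothetical):

```python
class TextModel:
    """Fine-grained data model."""
    def __init__(self, text=""):
        self.text = text

class TextApp:
    """Low-level application model with well-defined functionality."""
    def __init__(self, model):
        self.model = model

class WorkspaceModel:
    """Data model containing text models."""
    def __init__(self, items):
        self.items = items

class WorkspaceApp:
    """Higher-level application model aggregating the low-level ones;
    its child hierarchy is isomorphic to the data containment."""
    def __init__(self, workspace):
        self.workspace = workspace
        self.children = [TextApp(m) for m in workspace.items]
```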

Using small application models fostered a new conception of what developers regarded as an application. We view the application model as a description of additional semantics for a data model, instead of the conventional view of data as a supplement that applications will edit. This change in viewpoint, therefore, leads to an information-centric perspective of application models.

In the context of roomware environments, it’s essential that developers do not include user interface and environment aspects in the application model. Enforcing a strict separation between application model and device-dependent aspects makes it possible to reuse application models with different user interfaces and within a different environment.

Environment model: Context awareness. One major property of ubiquitous computing environments is the heterogeneity of the available devices. To provide a coherent user experience, “the system must have a deeper understanding of the physical space.”12 This raises the need for an adequate model of the application’s physical environment.

Therefore, the environment model is the representation of relevant parts of the real world. This includes a description of the devices themselves, their configuration, and their capabilities. This is the direct hardware environment, which the user interface model can employ in adapting to different devices. In addition, the environment model can include other aspects if these aspects influence the software’s behavior. Depending on detected changes in the physical environment, the software can trigger further actions to reflect the current situation. An example of this is the way ConnecTables establish a common workspace when placed next to each other.

Besides the physical environment, other contextual information, such as the current task, project, or coworker presence, could influence the software’s behavior—insofar as this information is available to the application. We refer to this type of contextual information as the logical context of the application.3 Currently, it’s difficult for software applications to grasp the physical environment and logical context. Further work must establish how to capture sufficient information about the current environment and to define appropriate models (for example, as in Sousa and Garlan4).

User interface model: Interface objects. Because traditional operating and window management systems are suited for a traditional desktop PC, their user interfaces have drawbacks when used with devices without a mouse and keyboard, or those having different forms and sizes. For instance, if a menu bar were always at the top of the screen in a wall-sized display such as DynaWall, shown in Figure 1, users would find it difficult to reach. Likewise, a toolbar takes up a lot of precious screen space on a small device, such as a personal digital assistant.

Accordingly, the user interface model could define alternative user interface concepts suitable for different interaction devices, for example, rotation of user interface elements on horizontal displays. To choose an appropriate user interface, the user interface model can draw on information provided by the environment model. An explicit model of an appropriate user interface addresses all issues related to the hardware and the physical environment, making applications and documents device and environment independent.

Still, the user interface model must not enforce a dedicated presentation or interaction style; this is the responsibility of the interaction model. Rather, the user interface model concentrates on the elements offered for interaction. These elements can be a device-independent representation of user interface widgets or interactors. Figure 6 illustrates the dependencies between data, application, environment, and user interface models.
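A user interface model consulting the environment model could, for instance, select interface elements as in the following sketch. The device categories and the chosen elements are illustrative assumptions, not the actual Beach user interface model.

```python
def choose_menu_style(env):
    """Pick an interface element based on the described device.
    `env` stands in for the environment model; keys are hypothetical."""
    if env["kind"] == "wall":        # DynaWall-like: a top menu bar is
        return "popup-at-pen"        # out of reach, so open menus at the pen
    if env["kind"] == "pda":         # small screen: a toolbar wastes space
        return "popup-menu"
    if env.get("horizontal"):        # table: elements must be rotatable
        return "rotatable-toolbar"
    return "menu-bar"                # classic desktop default
```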

Interaction model: Presentation and interaction. To support different styles of interaction, it’s crucial to separate interaction issues from all other aspects of an application. The interaction model is the only place that specifies presentation aspects or interaction style. This way, a software system can adapt its presentation for different contexts, for example, by using a pop-up menu instead of a list box. It’s also possible to choose a different representation—when no display is available, voice-based interaction might still be possible.

Hence, the interaction model defines a way to interact with all other basic models, as shown in Figure 6. An appropriate interaction style depends on the available interaction devices and the associated user interface. Software can choose a suitable interaction model, depending on the environment and user interface models.

When designing an interaction model, the software developer has to choose an architectural style that is appropriate for the supported interaction style. For visual interaction, researchers have successfully used an adapted version of the model-view-controller style; this style separates input and output explicitly.13

Views render a visual representation of their model. Watching for model changes, the view updates the representation whenever the model is modified. Controller objects receive input events and modify their associated model accordingly. This way, the model needs no information about how it is visualized, or how users can trigger functionality.
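A minimal version of this adapted model-view-controller style, with the model unaware of its views and controllers, might look like the following sketch (names are generic, not from the cited framework):

```python
class Model:
    """Holds state and notifies observers; knows nothing about views."""
    def __init__(self, value=""):
        self.value, self._observers = value, []

    def attach(self, obs):
        self._observers.append(obs)

    def set(self, value):
        self.value = value
        for obs in self._observers:
            obs.update(self)

class View:
    """Renders a representation and re-renders on every model change."""
    def __init__(self, model):
        self.rendered = model.value
        model.attach(self)

    def update(self, model):
        self.rendered = model.value

class Controller:
    """Turns input events into model changes; output happens via views."""
    def __init__(self, model):
        self.model = model

    def on_input(self, event):
        self.model.set(event)

m = Model("a")
v = View(m)
Controller(m).on_input("b")  # the view follows the model automatically
```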

Coupling and sharing

Aiming at synchronous collaboration, traditional computer-supported cooperative work (CSCW) or groupware systems14 have two crucial aspects: access to shared data and the ability to couple the applications of collaborating users. Obviously, this coupling must apply to both data and application models for software running in a distributed environment.11

In the context of ubiquitous-computing environments, we must extend this view. In addition to data and application, different devices and applications must exchange information about the physical environment, such as the presence of nearby users or other available interaction devices. The user interface can be distributed among several machines or among complementing devices. Beach software explores these additional issues.

Sharing the data model: Collaborative data access. To access and work with common documents, researchers widely agree that a shared model for documents reduces the complexity in dealing with distributed applications.13 In the example of a team member sitting in a CommChair and working with another member at the DynaWall, both users have access to the same information objects and can modify them simultaneously.

Figure 6. Dependencies between data, application, environment, user interface, and interaction models. The user interface model can draw on information available in the environment model to define an application’s interface.

Sharing the application model: Workspace awareness. As an easy way of sharing information about the editing state of other users, researchers have proposed sharing the application model as well as the data model.11 Sharing the editing state allows accessing information about who is working on which document, providing awareness information to collaborating team members. For example, activity indicators15 can provide visual feedback for actions performed at a DynaWall by a user sitting in a CommChair.

By changing the state of the application model, the software can control possible work modes such as the degree of coupling. When two users in Beach share the same workspace browser, they couple their navigation; when one user switches to another workspace, all users sharing the same application model will follow.7
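Coupled navigation through a shared application model can be sketched as follows; the names (SharedBrowserState, BrowserView) are illustrative assumptions, not Beach's actual API.

```python
class SharedBrowserState:
    """Application model shared by all coupled clients."""
    def __init__(self, workspace):
        self.workspace = workspace
        self._observers = []

    def attach(self, obs):
        self._observers.append(obs)

    def switch_to(self, workspace):
        self.workspace = workspace
        for obs in self._observers:   # every coupled client follows
            obs.state_changed(self)

class BrowserView:
    """One client's view onto the shared browser state."""
    def __init__(self, state):
        state.attach(self)
        self.showing = state.workspace

    def state_changed(self, state):
        self.showing = state.workspace

shared = SharedBrowserState("agenda")
chair_view = BrowserView(shared)   # user in a CommChair
wall_view = BrowserView(shared)    # user at the DynaWall
shared.switch_to("sketches")       # one user navigates; both views follow
```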

Sharing the environment model: Environmental awareness. When several people and devices physically share a common environment, it's obvious that the applications used in such situations should also have a shared model of how their environment looks.

In ubiquitous-computing environments, many devices have sensors that grasp some aspects of the physical environment. By combining all available information and making it accessible to other applications, each application draws on context information that it can use to adapt its behavior. Thus, a shared environment model can serve as the basis for environment or context awareness.

When someone places roomware components, such as two ConnecTables, next to each other, the ConnecTables update their shared environment model using the information detected by sensors. As soon as Beach observes this change, it triggers functionality to connect the two displays to form a homogeneous interaction area.8 Currently, the involved sensors are attached to computers built into the ConnecTables; future work could replace or augment this setup. A sophisticated object tracking system, for example, as described in Brumitt et al.,12 involves computers integrated into the environment. Here, a shared environment model enables arbitrary computers to update the information.
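The idea of a shared environment model fed by sensors can be sketched as a publish-subscribe store; all names here (EnvironmentModel, the "tables_adjacent" key) are invented for illustration.

```python
class EnvironmentModel:
    """Shared store of facts about the physical environment."""
    def __init__(self):
        self.facts = {}
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def report(self, key, value):
        """Called by any sensor, on any machine, to publish context."""
        self.facts[key] = value
        for cb in self._subscribers:   # applications react to new context
            cb(key, value)

env = EnvironmentModel()
log = []
env.subscribe(lambda k, v: log.append((k, v)))

# A proximity sensor detects two adjacent tables; an application that
# subscribed to the model can now, e.g., join the two displays.
env.report("tables_adjacent", ("connectable-1", "connectable-2"))
```

Because arbitrary machines can call `report`, the sensing hardware need not be attached to the devices that react to the change.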

Sharing the user interface model: Distributed user interface. We want a visual interaction area to cross the borders between adjacent displays that are connected to different machines (as in our realization of the DynaWall, or with ConnecTables). To do so, the user interface elements must move freely between the different displays, as Figure 7 shows. In this case, different machines must share user interface elements.

Furthermore, if one user interacts with different devices at the same time, it's useful to coordinate the devices' user interfaces. This is only possible if all involved devices can access information about the current user interface elements. For instance, a user sitting in a CommChair in front of a large DynaWall can view all information at the chair and on the wall at the same time. Consequently, the user would benefit if he could modify the information visible on the wall and remotely control the entire user interface.
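A shared user interface model that lets elements cross display borders can be sketched like this: UI elements live in one model shared by all machines, tagged with the display they currently occupy, so moving an element is just an update of that tag. All names are illustrative, not Beach's actual API.

```python
class SharedUIModel:
    """User interface model shared by all participating machines."""
    def __init__(self):
        self.elements = {}     # element id -> name of its current display
        self._displays = []

    def attach(self, display):
        self._displays.append(display)

    def move(self, element, display_name):
        self.elements[element] = display_name
        for d in self._displays:   # every machine re-renders its part
            d.refresh(self)

class Display:
    """One machine's display, rendering its share of the UI model."""
    def __init__(self, name, ui):
        self.name = name
        self.visible = set()
        ui.attach(self)

    def refresh(self, ui):
        self.visible = {e for e, d in ui.elements.items() if d == self.name}

ui = SharedUIModel()
left = Display("connectable-left", ui)
right = Display("connectable-right", ui)
ui.move("sticky-note", "connectable-left")
ui.move("sticky-note", "connectable-right")   # dragged across the border
```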

Depending on how much state collaborating users share, applications can control the degree of coupling. Sharing all involved user interfaces and application states produces a tightly coupled collaboration mode; sharing only the same data model creates a loosely coupled environment.11
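The mapping from shared state to coupling degree sketched above can be stated as a small function; the classification names follow the paragraph, and the function itself is an invented illustration.

```python
def coupling_degree(shared_models):
    """Map the set of shared models to a collaboration mode."""
    if {"data", "application", "user_interface"} <= shared_models:
        return "tightly coupled"   # same view, same editing state
    if "data" in shared_models:
        return "loosely coupled"   # same documents, independent views
    return "uncoupled"
```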

Linking the interaction and shared models. Implementing data, tool, user interface, and environment models as shared objects gives several users or devices simultaneous access to these objects. However, objects for the interaction model must exist locally on each machine. This is necessary because the interaction model's objects must communicate with the locally available interaction devices. Moreover, a local interaction model lets each client adapt the interaction style according to its local context, especially to its physical environment and interaction capabilities. Tandler et al. give an extensive example of how local interaction objects can adapt to their local context.8

Figure 7. When placed next to each other, two ConnecTables allow seamless movement of user interface elements from one device to another. This is realized by using a shared user interface model.

Although the interaction model is local to every machine, for synchronous collaboration, the generated presentation must be consistent with the underlying models. Therefore, Beach uses a dependency mechanism, similar to the one provided by Amulet (http://www-2.cs.cmu.edu/~amulet), to link the output of the interaction model to the shared models it presents. Beach ensures that refresh and recomputation begin as soon as an observed model changes.
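A dependency mechanism of this kind can be sketched as local presenters registering with the shared models they render: when an observed model changes, each machine recomputes its own presentation in its own local style. Names and the scale factor are illustrative assumptions.

```python
class SharedModel:
    """Shared object that triggers recomputation in its dependents."""
    def __init__(self, value):
        self.value = value
        self._dependents = []

    def depends(self, presenter):
        self._dependents.append(presenter)

    def set(self, value):
        self.value = value
        for p in self._dependents:   # refresh begins as soon as it changes
            p.refresh(self)

class LocalPresenter:
    """Exists once per machine; adapts presentation to local context."""
    def __init__(self, model, scale):
        self.scale = scale           # e.g. larger rendering on a DynaWall
        model.depends(self)
        self.refresh(model)

    def refresh(self, model):
        self.output = model.value * self.scale

note = SharedModel(10)
wall = LocalPresenter(note, scale=4)    # large interactive wall
table = LocalPresenter(note, scale=1)   # ConnecTable
note.set(12)   # both presentations refresh, each in its local style
```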

Sharing environment, user interface, and application models lets all clients access the information encapsulated in the models. This can provide awareness information to the user as part of the interaction model. Typical for CSCW applications is the provision of workspace or activity awareness.15 This can easily be realized by sharing the application model, including all editing state.11 A shared user interface model can be used to implement tightly coupled user interfaces. However, an always tightly coupled user interface can be inconvenient to use. Therefore, shared user interface information can instead supply additional awareness hints to remote users. Beyond the provision of awareness in traditional CSCW systems, sharing the environment model enables a new type of awareness—environmental awareness—for ubiquitous computing environments.

Conceptual levels of abstraction

The third dimension of the application model is the abstraction level. Separating software into levels of abstraction is a common software engineering technique; it reduces the complexity of each level16 and ensures interoperability.17

For example, a core functionality of the interaction model, such as handling physical interaction devices, belongs to a very low level. Based on this functionality, higher levels define abstractions, such as widgets or logical-device handlers. High-level interaction components use these abstractions to define the user's access and interaction possibilities for some other model at the same level of abstraction. The application model presented here proposes four levels of abstraction.

Core level: Platform-dependent infrastructure. The core level provides functionality that makes higher-level development convenient by abstracting from the underlying hardware platform.

Roomware applications require additional functionality that is unavailable from off-the-shelf libraries or toolkits. This functionality includes support for multiuser event handling or low-level device and sensor management. For Beach, this includes implementation of the shared-object space and the dependency mechanism.

Model level: Basic separation of concerns. The model level provides basic abstractions that can serve as the basis for the definition of higher-level abstractions. Here, for example, Beach implements the model-view-controller style for the interaction model.

Generic level: Reusable functionality. One important goal of every software system is to provide generic components suited for many different situations and tasks. Therefore, software developers should group models and concepts that apply to a whole application domain at a generic level. This way, the software developer must think about generic concepts, which will lead to the implementation of reusable elements. At the generic level, Beach defines generic document elements, such as workspaces, text, scribbles, or hyperlinks.

Task level: Tailored support. When a conceptual application model defines only generic elements, this restricts the application's usability to some degree. Some tasks require specialized support. Therefore, our conceptual model has a task level that groups all high-level abstractions unique to small application areas. For example, we have implemented support for creative sessions on top of Beach.18
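The four levels can be pictured as a layered class structure, each level building only on the levels below it. All class names here are invented illustrations, not Beach's actual framework.

```python
# Core level: platform-dependent infrastructure
class SharedObject:
    """Replicated object with change notification (core level)."""
    pass

# Model level: basic separation of concerns
class Model(SharedObject):
    """Base class separating data from its presentation (model level)."""
    pass

# Generic level: reusable document elements for a whole domain
class Workspace(Model):
    """Container for arbitrary document elements (generic level)."""
    pass

class Scribble(Model):
    """Freehand pen stroke (generic level)."""
    pass

# Task level: tailored support for one small application area
class BrainstormingBoard(Workspace):
    """Specialized workspace for creative sessions (task level)."""
    pass
```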

Our initial experiences are quite promising. In addition to our Ambiente Lab in Darmstadt, we had two external installations of i-Land: at the Deutsche Arbeitsschutzausstellung (DASA), or German Occupational Safety and Health Exhibition, in Dortmund, which is ongoing, and at Wilkhahn in Bad Münder, which lasted for five months. Both installations were part of registered projects of the world exhibition EXPO 2000. In 2000, second-generation roomware components won the International Design Award of the state Baden-Württemberg in Germany.

Given the structure suggested by the application model, we have restructured Beach to consist of layered frameworks. With these frameworks, we are currently developing further applications tailored for roomware components (http://ipsi.fhg.de/ambiente).

We consider architectural space as a guiding metaphor for designing environments that support cooperation between humans and their interaction with information. This is an important perspective for us. Innovative forms of interaction, such as throwing information objects on large interactive walls, provide intuitive forms of cooperation and communication. Nevertheless, it remains to be seen how far the use of these concepts and metaphors will actually carry.

We are continuing this research in the Ambient-Agoras project, which is part of the European-Union-funded Disappearing Computer initiative.

Acknowledgments

We thank Jörg Geißler, Torsten Holmer, and Christian Müller-Tomfelde as well as many of our students for their valuable contributions to various parts of the i-Land project and the Ambiente division. Likewise, we thank the IEEE Computer Society's reviewers for their extensive and helpful comments. Furthermore, we appreciate the cooperation with Heinrich Iglseder, Burkhard Remmers, Frank Sonder, and Jürgen Thode from the German office furniture manufacturer Wilkhahn; and Michael Englisch from their design company, WIEGE. They worked with us in the context of the Future Office Dynamics consortium (http://www.future-office.de), which sponsored part of this work.

References

1. N.A. Streitz et al., "i-LAND: An Interactive Landscape for Creativity and Innovation," Proc. Conf. Human Factors in Computing Systems (CHI), ACM Press, New York, 1999, pp. 120-127; http://ipsi.fhg.de/ambiente/publications.
2. M. Weiser, "The Computer for the 21st Century," Scientific American, Sept. 1991, pp. 94-104; http://www.ubiq.com/hypertext/weiser/SciAmDraft3.html.
3. A. Schmidt, M. Beigl, and H. Gellersen, "There is More to Context than Location," Computers & Graphics, vol. 23, no. 6, Dec. 1999, pp. 893-902; http://www.elsevier.com.
4. J.P. Sousa and D. Garlan, "Aura: An Architectural Framework for User Mobility in Ubiquitous Computing Environments," Proc. 3rd Working IEEE/IFIP Conf. Software Architecture, Kluwer Academic, Boston, 2002, pp. 29-43; http://www.cs.cmu.edu/~aura/.
5. N.A. Streitz, P. Rexroth, and T. Holmer, "Does 'roomware' Matter? Investigating the Role of Personal and Public Information Devices and their Combination in Meeting Room Collaboration," Proc. European Conf. Computer-Supported Cooperative Work (E-CSCW), Kluwer Academic, Amsterdam, 1997, pp. 297-312; http://ipsi.fhg.de/ambiente/publications.
6. B.A. Myers, "User Interface Software Tools," ACM Trans. Computer-Human Interaction, vol. 2, no. 1, Mar. 1995, pp. 64-103; http://doi.acm.org/10.1145/200968.200971.
7. P. Tandler, "Software Infrastructure for Ubiquitous Computing Environments: Supporting Synchronous Collaboration with Heterogeneous Devices," Proc. Ubiquitous Computing (UbiComp), LNCS vol. 2201, Springer, New York, 2001, pp. 96-115.
8. P. Tandler et al., "ConnecTables: Dynamic Coupling of Displays for the Flexible Creation of Shared Workspaces," Proc. 14th Ann. ACM Symp. User Interface Software and Technology (UIST), vol. 3, no. 2, CHI Letters, ACM Press, New York, 2001, pp. 11-20; http://ipsi.fhg.de/ambiente/publications.
9. VisualWorks User's Guide, rev. 2.0 (software release 2.5), ParcPlace-Digitalk, Palo Alto, Calif., 1995.
10. I. Jacobson et al., Object-Oriented Software Engineering: A Use Case Driven Approach, Addison-Wesley Professional, Boston, 1992.
11. C. Schuckmann, J. Schümmer, and P. Seitz, "Modeling Collaboration using Shared Objects," Proc. Int'l ACM SIGGROUP Conf. Supporting Group Work, ACM Press, New York, 1999, pp. 189-198; http://www.opencoast.org.
12. B. Brumitt et al., "EasyLiving: Technologies for Intelligent Environments," Proc. 2nd Int'l Symp. Handheld and Ubiquitous Computing (HUC), LNCS vol. 1927, Springer-Verlag, Heidelberg, Germany, 2000, pp. 12-29.
13. W.G. Phillips, Architectures for Synchronous Groupware, tech. report 1999-425, Dept. Computing and Information Science, Queen's University, Kingston, Ontario, Canada, 1999; http://phillips.rmc.ca/greg/pub.
14. C.A. Ellis, S.J. Gibbs, and G.L. Rein, "Groupware: Some Issues and Experiences," Comm. ACM, vol. 34, no. 1, Jan. 1991, pp. 38-58.
15. C. Gutwin and S. Greenberg, "Design for Individuals, Design for Groups: Tradeoffs between Power and Workspace Awareness," Proc. ACM Conf. Computer Supported Cooperative Work, ACM Press, New York, 1998, pp. 207-216; http://doi.acm.org/10.1145/289444.289495.
16. L. Nigay and J. Coutaz, "Building User Interfaces: Organizing Software Agents," Esprit '91 Conf. Proc., ACM Press, New York, 1991, pp. 707-719; http://iihm.imag.fr/publs/1991/.
17. J.I. Hong and J.A. Landay, "An Infrastructure Approach to Context-Aware Computing," Human-Computer Interaction, vol. 16, no. 2-4, Dec. 2001, pp. 287-303.
18. T. Prante, C. Magerkurth, and N.A. Streitz, "Developing CSCW Tools for Idea Finding: Empirical Results and Implications for Design," Proc. ACM 2002 Conf. Computer Supported Cooperative Work (CSCW), ACM Press, New York, 2002; http://ipsi.fhg.de/ambiente/publications.

Peter Tandler is a scientific staff member of the Ambiente division of the Fraunhofer Integrated Publication and Information Systems Institute (IPSI). He leads software development within the Beach and i-Land projects. His research interests include synchronous groupware, integration of virtual and physical environments, new forms of human-computer and team-computer interaction for roomware, software architecture, programming languages, object-oriented frameworks, and object-oriented design and programming. Tandler is working on a PhD in the context of application models and software infrastructure for roomware environments at Fraunhofer IPSI. He has a Dipl.-Inform. in computer science from the Darmstadt University of Technology, Germany.

Norbert Streitz is the head of the research division Ambiente—Workspaces of the Future, which he founded to initiate work on roomware and cooperative buildings at Fraunhofer IPSI. He also teaches in the computer science department of the Technical University Darmstadt. He is the chair of the steering group of the European research initiative The Disappearing Computer (DC) and manager of the DC project Ambient Agoras. His research interests include human-computer interaction, hypermedia, computer-supported cooperative work, ubiquitous computing, user-centered design of smart artifacts, and the relationship between real and virtual worlds. Streitz has an MSc and PhD in physics and a second PhD in psychology. He is a member of the German Society of Computer Science (GI) and the German Society of Psychology (DGP).

Thorsten Prante is a scientific staff member of the Ambiente division at Fraunhofer IPSI. He also coordinates activities of the Future Office Dynamics consortium, and he teaches at the Darmstadt University of Technology in the departments of computer science and architecture. His research interests include human-computer interaction and computer-supported cooperative work, focusing on user interfaces for cooperative software in ubiquitous computing. Prante has a Dipl.-Inform. in computer science with minors in architecture and software ergonomics.

Direct questions and comments about this article to Peter Tandler, Fraunhofer IPSI, Ambiente division, Dolivostr. 15, 64293 Darmstadt, Germany; [email protected].

For further information on this or any other computing topic, visit our Digital Library at http://computer.org/publications/dlib.
