HAL Id: hal-01533082
https://hal.inria.fr/hal-01533082
Submitted on 5 Jun 2017



To cite this version: Thomas Pietrzak, Gilles Bailly, Sylvain Malacria. Actuated Peripherals as Tangibles in Desktop Interaction. Proceedings of the European Tangible Interaction Studio (ETIS 2017), Jun 2017, Luxembourg, Luxembourg. pp. 4. hal-01533082.


Actuated Peripherals as Tangibles in Desktop Interaction

Thomas Pietrzak1, Gilles Bailly2, Sylvain Malacria3

1University of Lille, France, 2CNRS, France, 3Inria, France

ABSTRACT
TUIs use everyday physical objects to interact with digital information. With decades of usage, computer peripherals have become everyday physical objects. We observed that users manipulate them for purposes other than input and output: for example, users turn their screen to avoid sun reflections, or move their keyboard and mouse because they need space on their desk. In this work, we regard computer peripherals as everyday objects and use them as TUIs. This paper presents two levels of tangible interaction with desktop computers. The first is a keyboard with actuated keys: the keys can rise from their initial position, which can be used to represent information or to extend interaction with the keyboard. At the second level, we actuated a mouse, a keyboard and a screen so that they can move around on the desk. We present scenarios showing how this extends interaction with a desktop computer setup.

Author Keywords
actuated devices, TUI, tangible interaction, desktop interaction

INTRODUCTION
In the early days of tangible interaction, Ullmer and Ishii described tangible interaction this way: "TUIs will augment the real physical world by coupling digital information to everyday physical objects and environments." [18]. The idea is to break the barrier between the physical and the digital world. With this paradigm, any object can either represent digital information or be a proxy for manipulating digital information. Similarly to input and output devices, these objects are instrumented with sensors or actuators, which are the links with the digital world.

The question of whether a computer mouse is a TUI is interesting. On the one hand, it complies with Ullmer and Ishii's definition: since this definition was introduced, computer peripherals have become everyday objects. On the other hand, the computer mouse was specifically designed for interaction with digital information, and would not exist otherwise.

Now, consider a user giving a talk with a slideshow. He holds the mouse in his hand and only uses the buttons to move to the next slide, the same way he would with a remote control. In this case, he is not using the mouse as it was designed to be used. Is this sufficient to consider the computer mouse a TUI in this specific scenario?

Going further, imagine a computer mouse actuated so that it can move around on the table. This mouse moves around on the desk to give the user notifications when he is not watching the screen. In this situation, the mouse is clearly not used as it was designed to be. In this work, we explore actuation and motion as ways of interacting with computer peripherals that they were not designed for. We discuss scenarios in which computer peripherals are tangible objects for interacting with digital information.

RELATED WORK
We describe below evolutions of the desktop interface, the use of motion as an output modality, and shape-changing interfaces.

Rethinking desktop interaction
The way we interact with computer peripherals has not changed much since their invention: from an interaction point of view, mice, keyboards and screens are essentially the same devices they were decades ago.

Studies have shown that some of these design choices are questionable. For example, Pietrzak et al. studied the impact of mode delimiters for keyboard shortcuts by replicating the CTRL and SHIFT keys on the thumb buttons of the mouse [12]. They observed shortcut entry performance similar to that obtained with the keyboard, which suggests that it makes sense to revisit design choices made decades ago.

Research has explored additional dimensions to extend the capacities of computer peripherals. Rekimoto et al. added capacitive sensing to the keys of a keyboard [16], which enables sensing whether the user touches a key or not. They propose scenarios in which this information is used to display feedforward, and other scenarios which take advantage of this extended vocabulary to enhance interaction.

Beyond rethinking desktop devices, Bi et al. used the desk itself for interaction [3]. They extend the peripherals' capabilities with interaction on the desk, both for multi-touch input and for a projected display. Conversely, Gervais et al. use everyday objects as viewports, which share or extend the screen's real estate [6]. These systems explore tangible properties of the desktop environment to extend interaction.


Motion output
Motion is a property of the interaction with an object. It is commonly used as an input value, but we are interested in the motion of a physical object as an output modality. Motion as output produces both visual and haptic feedback.

Löffler et al. designed insect-like desk companions [11]. These companions can move around on the desk to give the user notifications through the visual channel; the authors focused on their affective effect on the user. Interestingly, motion-based interfaces can take advantage of both the visual and haptic aspects of movement. Zooids are small robots which cooperate to achieve a particular task [9]: in some situations they represent points on a graph, in others they move an object on a table.

Actuating an object enables dynamic force feedback when the user touches it. For example, Roudaut et al. explored the idea of actuating a layer over a touchscreen to guide the finger touching the device [17], which makes it possible to teach users gestures, such as gesture keyboard symbols. Other studies use motion to encode information: either the system controls the movement [5, 13], or it only constrains the movements of the user [14]. Similarly to Tactons [4], information is coded by mapping pieces of information to signal parameters such as amplitude, size or shape. Below, we explain how we use motion to extend interaction with computer peripherals.
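To make this kind of encoding concrete, here is a minimal sketch, in the spirit of Tactons [4], of how a notification could be mapped to motion parameters of an actuated device. The MotionPattern structure, the parameter ranges and the category mapping are our own illustrative assumptions, not an API from the cited work.

    from dataclasses import dataclass

    @dataclass
    class MotionPattern:
        amplitude_mm: float   # how far the device travels on each stroke
        frequency_hz: float   # how fast it oscillates back and forth
        shape: str            # trajectory shape: "line", "circle" or "zigzag"

    def encode_notification(urgency: int, category: str) -> MotionPattern:
        """Map message attributes to motion parameters, one attribute per parameter."""
        shapes = {"email": "line", "calendar": "circle", "alert": "zigzag"}
        return MotionPattern(
            amplitude_mm=5.0 + 5.0 * urgency,    # more urgent -> larger excursions
            frequency_hz=0.5 + 0.5 * urgency,    # more urgent -> faster oscillation
            shape=shapes.get(category, "line"),  # message category -> trajectory shape
        )

    # encode_notification(2, "alert") -> MotionPattern(amplitude_mm=15.0,
    #                                                  frequency_hz=1.5, shape="zigzag")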

Shape-changing interfaces
Actuating objects also makes it possible to change their shape, and therefore their affordances. Knobslider is an example of an interactive object which is either a button or a slider, depending on its shape [7]. This object was specifically designed to behave this way. Conversely, Kim et al. designed an inflatable mouse [8] which can either give notifications, or be used as an elastic input device for continuous rate control.

ACTUATED PERIPHERALS
In this project we use both concepts, motion as output and shape-changing interfaces, to redesign computer peripherals. We discuss design rationales at the device level and the desktop level, and envision extending the concept to an entire room or house.

Device level
Motion is an essential aspect of interaction with peripherals: pointing devices rely on movement measurements, and keyboards use binary key positions as input data. In the Métamorphe project [1], we actuated the keys so that they can be either up or down (Figure 1). Keys can still be pressed, whether they are up or down.
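As an illustration, a host application could drive such actuated keys with a sketch like the one below. It assumes a hypothetical line-based serial protocol (RAISE/LOWER commands) over pyserial; Métamorphe's actual firmware interface may differ.

    import serial  # pyserial; the serial link itself is an implementation assumption

    class ActuatedKeyboard:
        """Host-side driver for a keyboard whose keys can be raised or lowered."""

        def __init__(self, port: str = "/dev/ttyACM0"):
            self.link = serial.Serial(port, baudrate=115200, timeout=1)

        def raise_key(self, key: str) -> None:
            # Hypothetical command: move the key to its raised position.
            # The key remains pressable in either position.
            self.link.write(f"RAISE {key}\n".encode())

        def lower_key(self, key: str) -> None:
            # Hypothetical command: return the key to its flat position.
            self.link.write(f"LOWER {key}\n".encode())

    # Example use: raise the keys of the shortcuts available in the focused application.
    # kb = ActuatedKeyboard()
    # for key in ("CTRL", "S", "Z"):
    #     kb.raise_key(key)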

This shape-changing keyboard has new properties compared to regular keyboards. When a key is up, the user can push it in four directions, or even pinch it (Figure 2). With a touch sensor all around it, the key could be used as an isometric pointing device such as a trackpoint.

Our previous studies showed that raising keys eases eyes-free interaction with the keyboard. Specifically, we observed that users locate raised keys and the surrounding ones more easily.

Figure 1: Métamorphe is a keyboard with actuated keys, which can either be up or down.

Figure 2: Raised keys have new affordances: they can be pushed or pinched.

The possibilities of such a keyboard go beyond text typing and keyboard shortcuts. Similarly to Relief [10], it is a shape-changing device which can be used to display information.

Desktop level
We observed people using a desktop computer, and identified situations in which they move their peripherals outside of their interaction with the computer. For example, we observed people turning their screen to avoid sun reflections. Other users turned their screen either to show visual content to somebody, or to show something in the room during a video conference with the camera affixed to the screen. It is also frequent to move the mouse and keyboard to make space on the desk for something else.

In the Living Desktop project [2], we actuated a mouse, a keyboard and a screen (Figure 3). Their degrees of freedom are listed below, and sketched in code after the list:

• The mouse can translate in the x,y plane.

• The keyboard can rotate, and translate in the x,y plane.

• The screen can rotate, and translate along the x axis.

Figure 3: The Living Desktop is a concept in which desktop peripherals can move around on the desk.
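The following minimal sketch models these degrees of freedom in code. The Pose structure and the class names are ours, for illustration; they are not the LivingDesktop implementation. Each device clamps a requested pose to the motions it actually supports.

    from dataclasses import dataclass

    @dataclass
    class Pose:
        x_mm: float = 0.0       # translation along the desk's x axis
        y_mm: float = 0.0       # translation along the desk's y axis
        angle_deg: float = 0.0  # rotation on the desk plane

    class Mouse:
        def clamp(self, p: Pose) -> Pose:
            return Pose(p.x_mm, p.y_mm, 0.0)       # x,y translation, no rotation

    class Keyboard:
        def clamp(self, p: Pose) -> Pose:
            return p                               # x,y translation and rotation

    class Screen:
        def clamp(self, p: Pose) -> Pose:
            return Pose(p.x_mm, 0.0, p.angle_deg)  # x translation and rotation only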

With these capabilities, devices can move on their own, without requiring the user to move them. The interesting question here is the degree of control the user has over his devices. There is a continuum between full control and full automation, on which we identify some particular degrees (sketched in code after the list):



• Telekinesis: the user moves the devices with remote controls.

• Tele-operation: the user suggests movements; the device decides to what degree it complies.

• Constraint: the user defines constraints on the devices' movements.

• Insurrection: the user has no influence on the devices' movements.
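One way to make this continuum operational is to treat each degree as a mediation policy between the user's request and the system's proposal, as in the sketch below. The mode names follow the list above; the blending rule and the numeric defaults are illustrative assumptions.

    from enum import Enum, auto

    class Control(Enum):
        TELEKINESIS = auto()    # the user fully drives the device
        TELEOPERATION = auto()  # the device partially complies with user suggestions
        CONSTRAINT = auto()     # the user only bounds where the device may go
        INSURRECTION = auto()   # the device ignores the user entirely

    def resolve(mode: Control, user_x: float, system_x: float,
                bounds: tuple = (0.0, 500.0), compliance: float = 0.7) -> float:
        """Return the target x position (mm) on one axis under a given control mode."""
        if mode is Control.TELEKINESIS:
            return user_x
        if mode is Control.TELEOPERATION:
            # Weighted blend: the device decides how much it complies.
            return compliance * user_x + (1.0 - compliance) * system_x
        if mode is Control.CONSTRAINT:
            lo, hi = bounds
            return min(max(system_x, lo), hi)  # free motion within user-set bounds
        return system_x  # INSURRECTION: the user's input is ignored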

We implemented a couple of scenarios that illustrate the concept.

Peephole display
Even with a large screen, the interactive screen real estate is limited. We propose to use the screen as a peephole display into a larger working space. In this scenario, the screen moves along the x axis and its pixels show the content located in that area of space: the screen is like a moving physical window. Here, the user controls the screen position.
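A minimal sketch of the underlying mapping, assuming a screen sliding on a rail of known travel and a fixed-size virtual workspace (the dimensions below are made-up examples, not measurements of the prototype):

    def peephole_offset(screen_x_mm: float,
                        travel_mm: float = 600.0,  # physical rail length (assumed)
                        virtual_w_px: int = 5760,  # virtual workspace width (assumed)
                        screen_w_px: int = 1920) -> float:
        """Map the screen's physical position to the viewport's left edge, in pixels."""
        t = min(max(screen_x_mm / travel_mm, 0.0), 1.0)  # normalized rail position
        return t * (virtual_w_px - screen_w_px)

    # At mid-travel (300 mm), the viewport starts at pixel 1920 of the 5760 px workspace.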

Video conference
When video-conferencing with a desktop computer, the camera is usually affixed to the screen. A problem arises when the user would like to show something he is manipulating outside the camera range: he has to move the screen at the same time as he is manipulating. In this scenario, the screen follows the user so that he can always see the video conference and show what he is doing to his collaborators. The user does not control the screen position in the strict sense of the term; however, he can activate or deactivate this behavior, and still control the screen position manually or with another interface.
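A simple way to realize this behavior is a proportional controller that nudges the screen toward the user's tracked position, as sketched below. The tracking source, the gain and the speed limit are our assumptions; the enabled flag reflects the user's ability to toggle the behavior on and off.

    def follow_step(screen_x_mm: float, user_x_mm: float, enabled: bool,
                    gain: float = 0.2, max_step_mm: float = 10.0) -> float:
        """Return the screen's next x position after one control tick."""
        if not enabled:          # the user can deactivate the following behavior
            return screen_x_mm
        step = gain * (user_x_mm - screen_x_mm)           # proportional term
        step = max(-max_step_mm, min(max_step_mm, step))  # cap the speed for safety
        return screen_x_mm + step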

Ergonomic coach
Being well seated is essential for healthy office work: it reduces fatigue and pain. It is, however, difficult to pay attention to our posture all day. In this scenario, the devices move away if we are not seated correctly on the chair. The user has no control over the devices in this situation.

Going further
Looking at the office environment, many other objects are involved, and they can be actuated to provide other interactive scenarios. Probst et al. presented a prototype chair used for input [15]; these chairs are not actuated, but equipped with sensors. Nissan, however, designed parking chairs (https://youtu.be/O1D07dTILH0) which can move around. In their scenario the chairs move back under the table to tidy the room, but we can envision other scenarios. Pull-down beds are another example of existing moving objects, which would be easy to actuate.

At a larger scale, the concept of moving walls makes it possible to have many rooms in a small flat (https://vimeo.com/110871691). Each wall carries specific equipment, suitable for a particular room. Thinking bigger still, rotating houses are another example of an actuated environment (https://youtu.be/dIHUwp9x8Fg). The obvious application is to maintain sunlight at a specific location in the house, but there may be many interesting interactive scenarios to study with such a building.

CONCLUSION
Early studies of TUIs considered everyday objects for interaction. Nowadays, computer peripherals have become everyday objects; as such, they can also be considered TUIs, as long as they are not used as the devices they were designed to be. We discussed how actuating computer peripherals enables new interactions. We presented a prototype keyboard with actuated keys which can move up and down, and a concept of moving computer peripherals which enables new interactions. We envision that this concept can apply to many other objects in our environment. The question we must always keep in mind is the degree of control we would like to keep over these wandering TUIs.

REFERENCES

1. Bailly, G., Pietrzak, T., Deber, J., and Wigdor, D. Métamorphe: Augmenting hotkey usage with actuated keys. In Proc. CHI'13 (Paris, France, May 2013), 563–572.

2. Bailly, G., Sahdev, S., Malacria, S., and Pietrzak, T. LivingDesktop: Augmenting desktop workstation with actuated devices. In Proc. CHI'16 (San Jose, USA, May 2016), 5298–5310.

3. Bi, X., Grossman, T., Matejka, J., and Fitzmaurice, G. Magic desk: Bringing multi-touch surfaces into desktop work. In Proc. CHI'11 (Vancouver, Canada, May 2011), 2511–2520.

4. Brewster, S. A., and Brown, L. M. Non-visual information display using tactons. In E.A. CHI'04 (Vienna, Austria, Apr. 2004), 787–788.

5. Enriquez, M., and MacLean, K. E. The hapticon editor: A tool in support of haptic communication research. In Proc. HAPTICS 2003 (Los Angeles, USA, Mar. 2003), 356–362.


6. Gervais, R., Roo, J. S., and Hachet, M. Tangible viewports: Getting out of flatland in desktop environments. In Proc. TEI'16 (Funchal, Portugal, 2016), 176–184.

7. Kim, H., Coutrix, C., and Roudaut, A. Knobslider: Design of a shape-changing device grounded in users' needs. In Proc. IHM 2016 (Fribourg, Switzerland, Oct. 2016), 91–102.

8. Kim, S., Kim, H., Lee, B., Nam, T.-J., and Lee, W. Inflatable mouse: Volume-adjustable mouse with air-pressure-sensitive input and haptic feedback. In Proc. CHI'08 (Florence, Italy, Apr. 2008), 211–224.

9. Le Goc, M., Kim, L. H., Parsaei, A., Fekete, J.-D., Dragicevic, P., and Follmer, S. Zooids: Building blocks for swarm user interfaces. In Proc. UIST'16 (Tokyo, Japan, Oct. 2016), 97–109.

10. Leithinger, D., and Ishii, H. Relief: A scalable actuated shape display. In Proc. TEI'10 (Cambridge, USA, 2010), 221–222.

11. Löffler, D., Kaul, A., and Hurtienne, J. Expected behavior and desired appearance of insect-like desk companions. In Proc. TEI'17 (Yokohama, Japan, 2017), 289–297.

12. Pietrzak, T., Malacria, S., and Bailly, G. CtrlMouse et TouchCtrl: Dupliquer les délimiteurs de mode sur la souris [CtrlMouse and TouchCtrl: Duplicating mode delimiters on the mouse]. In Proc. IHM 2014 (Lille, France, Oct. 2014), 38–47.

13. Pietrzak, T., Martin, B., and Pecci, I. Affichage d'informations par des impulsions haptiques [Information display through haptic pulses]. In Proc. IHM 2005 (Toulouse, France, Sept. 2005), 223–226.

14. Pietrzak, T., Martin, B., and Pecci, I. Information display by dragged haptic bumps. In Proc. Enactive/05 (Genova, Italy, 2005).

15. Probst, K., Lindlbauer, D., Haller, M., Schwartz, B., and Schrempf, A. A chair as ubiquitous input device: Exploring semaphoric chair gestures for focused and peripheral interaction. In Proc. CHI'14 (Toronto, Canada, May 2014), 4097–4106.

16. Rekimoto, J., Ishizawa, T., Schwesig, C., and Oba, H. PreSense: Interaction techniques for finger sensing input devices. In Proc. UIST'03 (Vancouver, Canada, Nov. 2003), 203–212.

17. Roudaut, A., Rau, A., Sterz, C., Plauth, M., and Baudisch, P. Gesture output: Eyes-free output using a force feedback touch surface. In Proc. CHI'13 (Paris, France, May 2013), 2547–2556.

18. Ullmer, B., and Ishii, H. Tangible bits: Towards seamless interfaces between people, bits and atoms. In Proc. CHI'97 (Atlanta, USA, 1997), 234–241.