SmartProducts - Interaction Strategies


  • 8/20/2019 SmartProducts -Interaction Strategies

    1/92

     

SmartProducts: Proactive Knowledge for Smart Products

     

    SmartProducts 

    D.5.1.3: Final Description of Interaction Strategies

    and Mock-Up UIs for Smart Products

    WP 5

    Deliverable Lead: VTT

    Contributing Partners:

    CRF, EADS, PRE, TUD, OU

    Delivery Date: 31.01.2011

    Dissemination Level: Public

    Version 1.0

    Copyright © SmartProducts Consortium 2009-2012


    SmartProducts WP5 – Multimodal User Interfaces and Context-Aware User Interaction

    Deliverable D.5.1.3: Final Description of Interaction Strategies and Mock-Up UIs for Smart Products

    SmartProducts_D_5_1_3.doc Dissemination Level: Public Page 1

    Copyright © SmartProducts Consortium 2009-2012

    Deliverable Lead

    Name Organisation e-mail

    Jani Mäntyjärvi VTT [email protected]

    Contributors

    Name Organisation e-mail

    Elena Vildjiounaite VTT [email protected]

    Ilkka Niskanen VTT [email protected]

    Marcus Ständer TUD [email protected]

     Aba-Sah Dadzie USFD [email protected]

    Jerome Golenzer EADS [email protected]

    Vanessa Lopez OU [email protected]

    Boris de Ruyter PRE [email protected]

    Julien Mascolo CRF [email protected]

    Internal Reviewer

    Name Organisation e-mail

     Andreas Budde SAP [email protected]

    Disclaimer

    The information in this document is provided "as is", and no guarantee or warranty is given

    that the information is fit for any particular purpose. The above referenced consortium members shall have no liability for damages of any kind including without limitation direct,

    special, indirect, or consequential damages that may result from the use of these materials

    subject to any liability which is mandatory due to applicable law. Copyright 2011 by VTT, TUD,

    USFD, EADS, PRE, CRF, OU. 


    Table of Contents

    LIST OF FIGURES ..................................................... 4

    LIST OF TABLES ...................................................... 7

    EXECUTIVE SUMMARY ................................................... 8

    1  INTRODUCTION ..................................................... 9

    2  INTERACTION STRATEGIES ........................................... 11

    2.1  DESCRIPTION OF INTERACTION STRATEGIES .......................... 11

    2.2  INTERACTION TYPES .............................................. 13

    2.3  A MODEL FOR GENERATING UIS FROM WORKFLOWS ...................... 15

    2.3.1   Displaying UIs .............................................. 17

    2.3.2   MUI/SUI based Mock-Up ....................................... 18

    2.4  FORMAL MODEL FOR USING INTERACTION TYPES ....................... 24

    2.4.1   Definition of Sets and the States ........................... 24

    2.4.2   Definition of Functions ..................................... 25

    2.4.3   Definition of Operations .................................... 27

    3  MOCK-UP UIS ...................................................... 30

    3.1  “DEFAULT FUNCTIONALITY” STRATEGY ............................... 31

    3.2  “GUIDE THE USER” STRATEGY ...................................... 34

    3.3  “ASK THE USER FOR CONFIRMATION” STRATEGY ....................... 44

    3.4  “ADVICE/NOTIFY” STRATEGY ....................................... 51

    3.5  “RESPONSE TO THE USER’S REQUEST” STRATEGY ...................... 55

    3.6  “EXPLAIN PRODUCT ACTIONS” STRATEGY ............................. 58

    3.7  “ACKNOWLEDGE TASK” STRATEGY .................................... 67

    3.8  “SHORT-TERM CUSTOMISATION” STRATEGY ............................ 68

    3.9  “LONG-TERM CUSTOMISATION” STRATEGY ............................. 70

    3.9.1   Manual acquisition of user profile .......................... 70

    3.9.2   Learning .................................................... 72

    3.9.3   “Ask for the user’s feedback” strategy ...................... 74

    4  REQUIREMENTS ..................................................... 77

    5  CONCLUSION AND OUTLOOK ........................................... 84

    ANNEX ............................................................... 85

    A  GLOSSARY ......................................................... 86

    B  LIST OF ACRONYMS ................................................. 88

    REFERENCES .......................................................... 89

  • 8/20/2019 SmartProducts -Interaction Strategies

    5/92

    SmartProducts WP5 – Multimodal User Interfaces and Context-Aware User Interaction

    Deliverable D.5.1.3: Final Description of Interaction Strategies and Mock-Up UIs for Smart Products

    SmartProducts_D_5_1_3.doc Dissemination Level: Public Page 4

    Copyright © SmartProducts Consortium 2009-2012

    List of Figures

    Figure 1: Categories of Interaction Types from [Ständer-2010] ..... 13

    Figure 2: Master and Slave UIs from the Cocktail Companion demonstrator ..... 16

    Figure 3: Sequential SUIs ..... 17

    Figure 4: Parallel SUIs ..... 17

    Figure 5: Cocktail Companion MUI/SUI usage ..... 19

    Figure 6: Cocktail Companion MUI of the login activity ..... 19

    Figure 7: The MUI of the login screen in the real setting in the Cocktail Companion ..... 20

    Figure 8: Cocktail Companion MUI welcome screen ..... 21

    Figure 9: Cocktail Companion MUI of the cocktail selection activity ..... 21

    Figure 10: Cocktail Companion MUI for a cocktail recipe without any SUIs ..... 22

    Figure 11: Cocktail Companion MUI of the cocktail preparation with the steps as SUIs ..... 22

    Figure 12: Warning when too much vodka has been added in the Cocktail Companion ..... 23

    Figure 13: The Cocktail Companion SUI for measuring the amount of filled-in vodka in a real setting ..... 23

    Figure 14: The Menu showing the sub-menu for browsing and searching the origami folds database ..... 31

    Figure 15: Task selection in the origami application for a large screen ..... 32

    Figure 16: Task selection in the cooking assistant for a large screen ..... 33

    Figure 17: User login for a small screen in the car assistant ..... 34

    Figure 18: Recipe guiding on a large screen in the cooking assistant ..... 35

    Figure 19: Recipe guiding on a small screen in the cooking assistant ..... 36

    Figure 20: Guiding for a vehicle component mounting in the automotive domain ..... 36

    Figure 21: Snow chain mounting Step x in the automotive domain ..... 37

    Figure 22: Snow chain mounting context sensing in the automotive domain ..... 38

    Figure 23: Dual visualisation and synchronisation of displays in the automotive domain ..... 38

    Figure 24: Guiding for the “replace wiper” task in the car assistant ..... 39

    Figure 25: Guiding in aircraft assembly ..... 40

    Figure 26: Steps overview in aircraft assembly ..... 41

    Figure 27: Guiding via images and text in the origami application ..... 42

    Figure 28: Guiding via videos in the origami application ..... 42

    Figure 29: Beginner/Expert process sequence in aircraft assembly ..... 43

    Figure 30: Instructions step – Beginner mode in aircraft assembly ..... 44

    Figure 31: Notification regarding detection of new snow chains on-board and asking for eLUM update confirmation ..... 45

    Figure 32: Asking the user for confirmation in the car assistant ..... 45

    Figure 33: Instructing the user to place a cup at the coffee dispenser in the Cocktail Companion ..... 46

    Figure 34: Details of instructing the user to place a coffee cup in the Cocktail Companion ..... 47

    Figure 35: Cocktail Companion actively trying to get feedback from the user ..... 47

    Figure 36: Tools and material collection in aircraft assembly ..... 48

    Figure 37: Smart Tool problem report in aircraft assembly ..... 48

    Figure 38: “Abort” procedure in aircraft assembly ..... 49

    Figure 39: “Retry” procedure in aircraft assembly ..... 49

    Figure 40: First reminder in the cooking assistant ..... 50

    Figure 41: Repeated reminder in the cooking assistant ..... 50

    Figure 42: Asking the user to confirm a profile update in the origami application ..... 51

    Figure 43: Work assignment in aircraft assembly ..... 52

    Figure 44: A car servicing advice for cold climate in the car assistant ..... 53

    Figure 45: A cooking advice for a hypertonic user in the cooking assistant ..... 53

    Figure 46: A cooking advice for weight watchers in the cooking assistant ..... 54

    Figure 47: GUI-based notification that the meal is ready in the cooking assistant ..... 54

    Figure 48: Response to the user request for detailed information in the car assistant ..... 55

    Figure 49: Different views on the task in aircraft assembly ..... 56

    Figure 50: Highlighted entries in the History Log browser in the origami application, based on a user request to ‘View Log’, in order to decide whether or not to accept the system prompt to update their profile ..... 57

    Figure 51: Recommendation for the user after changing the presentation modality preference to ‘Video Only’ in the origami application ..... 57

    Figure 52: Question answering tool interface ..... 58

    Figure 53: Introducing a sensor-augmented spoon in the cooking assistant ..... 60

    Figure 54: Explaining and reminding the availability of context-based support in the automotive domain ..... 60

    Figure 55: Explanation regarding transition to the next step of instructions on a large screen in the cooking assistant ..... 61

    Figure 56: Explanation regarding transition to the next step of instructions on a small screen in the car assistant ..... 61

    Figure 57: Explanation regarding disabling of audio output in the cooking assistant ..... 62

    Figure 58: Explanation regarding message triggering in the cooking assistant ..... 62

    Figure 59: Explanation regarding the danger of disabling reminders in the cooking assistant ..... 63

    Figure 60: Explanation regarding a way to combine preferences of multiple users for a large screen in the cooking assistant ..... 63

    Figure 61: Detailed explanation regarding a way to combine preferences of multiple users for a large screen in the cooking assistant ..... 64

    Figure 62: Detailed explanation regarding a way to combine preferences of multiple users for a small screen in the cooking assistant ..... 64

    Figure 63: The Menu showing the sub-menu for browsing the history log in the origami application ..... 65

    Figure 64: Browsing a user’s history; the detail is shown for the initial (bottom row) and two subsequent entries for the user’s profile in the origami application ..... 65

    Figure 65: Recommendation in the origami application, based on system settings only ..... 66

    Figure 66: Recommendation in the origami application, based on system settings only ..... 66

    Figure 67: Acknowledging that the smart wrench is ready for fixing in aircraft assembly ..... 67

    Figure 68: Acknowledgment of eLUM updating performed in the automotive domain ..... 67

    Figure 69: The Menu showing the sub-menu for setting user and system options in the origami application ..... 70

    Figure 70: Eliciting the user’s interaction preferences in the origami application ..... 71

    Figure 71: Recommendation in the origami application, based on the user profile ..... 72

    Figure 72: Dialog for updating system defaults, showing options for long-term customisation of user interaction in the origami application ..... 73

    Figure 73: Dialog for updating system defaults in the origami application, showing options for influencing long-term customisation of user interaction, by setting how the system is to log user actions ..... 74

    Figure 74: Recommendation feedback form in the origami application ..... 75

    Figure 75: Asking for the users’ feedback and consequent configuration in the cooking assistant ..... 76


    List of Tables

    Table 1: Sets of the Interaction Model ....................................................................................... 25 

    Table 2: Functions of the Interaction Model .............................................................................. 27 

    Table 3: Operations of the Interaction Model ............................................................................ 29 

    Table 4: Fulfilment of requirements from [D5.1.1] ................................................................... 83 


    Executive Summary

    This document presents the final interaction strategies and mock-up user interfaces for smart products. First we describe interaction strategies: a conceptual framework for describing the behaviour of smart products in different situations. The main interaction strategies for smart products are the following:

    •  Provide default functionality: allow users to give a task to a smart product

    •  Guidance: help to achieve a task that consists of multiple steps

    •  Ask the user for confirmation: ask the user to confirm his/her intentions or situational changes

    •  Advice/notification: inform the user about detected situational changes or provide possibly useful task-related information

    •  Response to the user request: provide the user with the information stored in the smart product

    •  Acknowledge received task: confirm to the user that the smart product knows its task

    •  Explain product actions: provide the reasons for product behaviour

    •  Short-term customisation: allow the user to quickly modify the product behaviour for the current interaction session

    •  Long-term customisation: allow the user to modify the product behaviour in the long term; if needed, with the help of the “Ask for the user’s feedback” strategy: asking how the user liked the behaviour of the smart product

    Then we describe the Interaction Types: patterns of conveying messages to the users. They are characterised by their level of visibility to the users, by the type of required or desired user response, and by the urgency of the expected user response. Then, using the “guiding” interaction strategy as an example, we describe how user interfaces are built and how interaction types are selected.
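    The three characteristics of an Interaction Type named above (visibility, type of required response, urgency) can be pictured as a small data structure. The following Python sketch is purely illustrative: the names `Visibility`, `ResponseKind`, `Urgency` and the toy selection rule are our assumptions for this example, not the formal model defined in Section 2.4.

    ```python
    from dataclasses import dataclass
    from enum import Enum, auto

    class Visibility(Enum):
        """How strongly the message imposes itself on the user (assumed levels)."""
        AMBIENT = auto()       # peripheral, easy to ignore
        NOTIFICATION = auto()  # visible but non-blocking
        MODAL = auto()         # interrupts the current activity

    class ResponseKind(Enum):
        NONE = auto()      # pure notification, no response expected
        OPTIONAL = auto()  # feedback is welcome but not required
        REQUIRED = auto()  # the interaction waits for an answer

    class Urgency(Enum):
        LOW = auto()
        NORMAL = auto()
        HIGH = auto()

    @dataclass(frozen=True)
    class InteractionType:
        """An interaction type as a triple of the three characteristics."""
        visibility: Visibility
        response: ResponseKind
        urgency: Urgency

    def select_interaction_type(needs_answer: bool, time_critical: bool) -> InteractionType:
        """Naive illustrative rule: the more the product needs from the user,
        and the sooner it needs it, the more intrusive the interaction type."""
        if needs_answer and time_critical:
            return InteractionType(Visibility.MODAL, ResponseKind.REQUIRED, Urgency.HIGH)
        if needs_answer:
            return InteractionType(Visibility.NOTIFICATION, ResponseKind.REQUIRED, Urgency.NORMAL)
        return InteractionType(Visibility.AMBIENT, ResponseKind.NONE, Urgency.LOW)
    ```

    For instance, a warning that requires an immediate answer would map to a modal, high-urgency type, while a background advice maps to an ambient notification with no required response.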

    Next we present multimodal user interfaces, illustrating how the different interaction strategies can be realised, and list the main interface elements required for their realisation. All interfaces presented in this document are parts of application prototypes, and the majority of them were tested in the user studies and accepted by the test subjects.

    The mock-ups were implemented in several domains: cooking, automotive, aircraft assembly and entertainment; some of the mock-ups were designed for large screens, some for small screens, and some for both. As interaction with smart products depends on the specifics of the application domain and devices, the presented mock-ups do not aim to prescribe how the interface layout should look; instead, they aim at presenting the required interface elements and their functions.


    1 Introduction

    The initial version of the interaction strategies and mock-ups was presented in [D5.1.2]. At that time the mock-ups were not part of functional application prototypes; they were just visions of how these applications should behave and how their interfaces should look. After that, application prototypes were implemented and tested in several user studies. The feedback from the first study was used to update the prototypes, and the updated prototypes were tested with the users again. The findings of the user studies confirmed the feasibility of the interaction strategies listed in [D5.1.2] and allowed us to add three more strategies: explanation of product actions, short-term customisation and long-term customisation. Consequently, one of the initially proposed interaction strategies, “ask for the user feedback”, became part of the long-term customisation strategy, because asking for user feedback is needed only for learning user preferences. The results of the mock-up development and user studies [D5.5.1] reinforce the importance of customising the interaction to the users and their contexts – the aim of SmartProducts – either using the same base interface or different interfaces and (underlying) systems, each of which corresponds to the user and task requirements.

    Interaction strategies can often be used in combination with each other. For example, during guiding it is often feasible to explain the actions of the smart products, and it is feasible to provide users with the means to customise (in the short term or in the long term) the behaviour of smart products along with the explanations. Explanations can also be provided upon user request; in this case the “response to the user request” strategy is combined with the “explain product actions” strategy. Work on the explanation strategy was done in close cooperation with the work on the SmartProducts Monitor [D3.4.1].

    This deliverable does not aim at an exhaustive presentation of all interaction modalities and strategies in all possible use cases; instead, it lists the main interface elements required for the realisation of the different interaction strategies, and presents implementation examples. The mock-ups demonstrate the difference between the usage (selected parts of the PRE and CRF scenarios) and manufacturing (selected parts of the EADS scenario) stages of the smart products lifecycle. At the usage stage it is necessary to provide various customisation options, including options to satisfy the preferences of several family members or friends involved in the same task. At the manufacturing stage it is necessary to ensure the efficiency and correctness of operations, and the provided customisation functionality is much more limited and does not aim at satisfying the preferences of multiple users, because every aircraft assembly operator uses his or her own device.

    The document is organised as follows. The next chapter describes the main interaction strategies and interaction types, and then the generation of interfaces from workflows that utilise interaction types. Chapter 3 presents the mock-ups for each interaction strategy (screenshots of applications in different domains, used for testing the realisation of the


    interaction strategy in the given domain and on the given device), and the last chapter presents conclusions and an outlook. The mock-ups presented in Chapter 3 do not have exactly the same functionality, as they were developed for studying different aspects of interaction between users and smart products. For example, one of the mock-ups focuses on manual acquisition and learning of user preferences: initial manual acquisition, comparison of these initial preferences with the choices made by the users in the course of interaction with the smart products, and updating the preferences in the course of interaction. Other mock-ups focus on helping the users to achieve practical tasks and to customise the smart products, but do not necessarily employ learning of user preferences. These mock-ups also have different functionality, sometimes due to the specifics of their domains (for example, mock-ups in the cooking domain must allow for the users’ desire to relax or to be creative during cooking, while in the aircraft assembly domain relaxation or extra creativity may be dangerous). Sometimes the differences between the mock-ups are due to their purposes: for example, we present two mock-ups in the automotive domain. The first automotive mock-up focuses on realising the interaction features described in the SmartProducts scenario, while the second was developed for studying differences between user perception of various interaction features in the cooking and automotive domains. It has exactly the same UI as the corresponding cooking mock-up, because otherwise differences in GUI appearance could affect user opinions. However, we present here all existing mock-ups, because all of them are successful examples of how interaction strategies can be implemented.


    2 Interaction Strategies

    As stated in the Description of Work of the SmartProducts project, one focus of WP5 lies on “concepts how humans can interact with proactive knowledge” and their technical implications. Thus a twofold approach has been chosen to describe interaction strategies. The first concept describes a set of different meta Interaction Strategies (IS), like guiding or notifying the user. Different strategies have an impact on the used components of the platform, as will be described in Section 2.1. The second concept, the concept of Interaction Types (IAT), mainly refers to the “Adaptive interaction based on proactive knowledge” problem (also stated in the Description of Work) and will be explained in Section 2.2. Finally, the theoretical and practical results of how the concept of IATs is formalised and later realised in the platform will be shown in Section 2.4.

    These building blocks together form a conceptual approach for understanding and handling the interaction between users and products.

    2.1 Description of Interaction Strategies

    In [D5.1.2], the initial set of ISs was introduced. These strategies resulted from the analysis of the scenario and interaction requirements and formed the scope of our architecture. In the following, we provide the final set of Interaction Strategies and describe how they are reflected in the SmartProducts platform.

    •  Guide the user: explain which actions the user should perform in order to achieve his/her goal (for example, explain how to assemble snow chains). This is the most central

    IS. Analyses of the scenarios revealed that this strategy can be realized by procedural

    knowledge, more exactly by workflows. Thus, interactive (context-aware) workflows

    [Ständer-2011] have become fundamental building blocks of smart products.

    •  Ask the user for confirmation: ask the user whether he/she has performed some action (for

    example, when it cannot be determined from context data), or offer help and ask

    whether the user wants it (for example, ask the user whether he/she wants the product

    to execute a certain task). This is product-initiated interaction, and since this type of

    interaction may be annoying, a three-tier approach for automation in smart

    environments was proposed [Ständer-2010]. If a product has to recognise user actions

    or to execute a task, it should first try to do so by itself. If this is not possible, it should

    try to find related products which could execute the task. If this also fails, the user has

    to be approached.

    •  Advice / Notify: inform about situational changes relevant to the user’s tasks or

    interests, as well as about problems: for example, the smart product can warn the user

    that the plate in the kitchen is getting too warm. This is also product-initiated

    interaction, but usually less annoying because it does not require a user response.

    •  Acknowledge task: inform the user that the smart product understood what it should do

    and will perform the task. This IS is based on traditional user interface theory, which

    has shown that users prefer immediate responses indicating whether a program (or product, in this case)

    understood what it should do and whether it will perform the task [Spiekermann-2007].

    •  Response to the user’s request: provide the user with the information stored in the

    smart product, e.g., to give more details regarding the user’s task, or to answer when the

    coffee machine or the car was serviced last time.

    •  Default Functionality: the set of possible actions a product can usually provide. In

    general, all these functions should always be available. For example, if the

    user is approaching a coffee machine, he/she should be able to get all available types of

    coffee and tea, even if the product knows that this user does not like coffee.

    •  Explanations of product actions: provide the user with the reasons for product

    behaviour, for example, explain that an interface change was caused by a recognised

    event.

    •  Short-term customisation: allow the user to quickly configure certain features of the

    smart product’s behaviour for the current interaction session, for example, to

    temporarily disable audio output.

    •  Long-term customisation: allow the user to configure various features of the smart

    product’s behaviour for the current and future interaction sessions, for example, to

    disable audio output until the user explicitly permits it. Long-term customisation may

    utilise learning, and for learning one more interaction strategy may be useful:

    o  Ask for the user’s feedback: if the user has performed a task, the smart product

    may ask her for feedback to find out whether its current behaviour is suitable or

    the user is unsatisfied. This feedback can be used for updating the user

    preferences.
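    The three-tier approach for automation mentioned under Ask the user for confirmation can be sketched as follows. This is a minimal illustration rather than the SmartProducts implementation; the names (handle_task, can_execute, ask_user) are chosen here for the example only.

    ```python
    # Sketch of the three-tier automation approach from [Staender-2010]:
    # 1) the product tries to execute the task itself,
    # 2) it looks for a related product that can execute the task,
    # 3) only as a last resort is the user approached.

    def handle_task(task, product, related_products, ask_user):
        # Tier 1: the product executes the task by itself, if it can.
        if product.can_execute(task):
            return product.execute(task)
        # Tier 2: delegate to a related product that is capable.
        for other in related_products:
            if other.can_execute(task):
                return other.execute(task)
        # Tier 3: approach the user.
        return ask_user(task)
    ```

    The point of the ordering is to minimise product-initiated interaction, which the text above identifies as potentially annoying.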


    2.2 Interaction Types

    [Figure: a grid arranging the Interaction Types along two axes, the Expected User Response (Product Input), with the values None, Simple (Yes/No) and Complex, and the importance of the message; the placement of the individual types is described in the text below.]

    Figure 1: Categories of Interaction Types from [Ständer-2010]

    While the ISs form a conceptual framework for describing different types of situations, the

    IATs are much closer to the actual interaction design. The focus of interaction in the

    SmartProducts platform does not lie on free chat applications where the user talks with the

    environment about random topics. Instead, products shall support the user in fulfilling his/her

    goals, which often results in guidance, realized by a workflow model in the

    background. However, to enable the smart product to figure out when and how to approach the

    user, the concept of IATs was introduced. As described in [D5.1.2], they have been derived in

    accordance with the interaction models of speech act theory [Austin-2000] and

    communicative acts [FIPA-2000]. For the reader’s convenience, we now provide a copy of the IATs from [D5.1.2], which has

    been slightly enhanced for better understanding.

    This instantiation focuses on practical product-initiated computer-to-human interaction to be

    used in smart environments. The different types of interaction elements can be arranged

    according to the importance of the main message for the user and the expected user response, as

    shown in Figure 1. Concerning the importance, we differentiate between interaction that can be

    omitted, that can be deferred and that cannot be deferred. The expected feedback can be split

    up into no expected feedback, simple predicate feedback (yes/no) and complex feedback. Please

    note that one of our core premises is that this feedback need not only be explicit feedback as

    provided, e.g., by the user filling out visual forms or pressing buttons. Moreover, we incorporate

    implicit feedback in the form of context. In these cases the definition of simple and complex

    feedback holds as well. Example: when the user has to pick up a certain tool and the SmartProducts

    platform highlights it, e.g. by letting the tool use a blinking LED, the platform can

    expect simple yes/no feedback by recognizing whether the user picked up the tool or not. In the

    corresponding complex case, the platform tells the user to pick up some tool of a category and

    tries to figure out which tool he/she picked up.

    These types can then be used to determine the modalities for interacting with the user in the

    most suitable way. Further, it might be necessary to allow changing the type during runtime:

    the subject of a notification which is disregarded for too long might become a critical issue

    after some time.
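    The runtime type change sketched above (a long-ignored deferrable interaction becoming critical) could be realized with a simple escalation table. The mapping below is an assumption for illustration; the text only states explicitly that a Predicate Prompt may become a Predicate Request.

    ```python
    # Sketch: escalating deferrable Interaction Types once their deadline
    # expires. The concrete escalation mapping is illustrative, not taken
    # verbatim from the platform.
    ESCALATION = {
        "Notification": "Warning",
        "PredicatePrompt": "PredicateRequest",
        "Prompt": "Request",
    }

    def escalate(iat):
        # Non-deferrable (and omittable) types keep their type.
        return ESCALATION.get(iat, iat)
    ```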

    Below is a brief description for these types and examples for the case of a smart coffee maker

    (its description can be found in [Aitenbichler-2007]): 

    Phrase Phrases only convey interaction of low importance, which can also be easily ignored,

    like greetings or wishes.

    Example: “Welcome back Charly”, see Figure 8

    Optional Predicate Prompt Optional Predicate Prompts can be used to obtain simple yes/no values. If

    no response is recognized, a predefined default value will be used.

    Example: Figure 52 shows an example where the user can select to use a sensor-augmented stirring spoon. If no reaction is received, the default behaviour is not to use it.

    Optional Prompt Optional Prompts allow more complex feedback. They also contain default

    values that are used if no user feedback is received for some time.

    Example: One example is shown in Figure 17, where a list of users can be selected. If no

    additional user is selected, the default, in this case the recognized user ‘Lena’, will be

    used in the further process.

    Notification The content of Notifications relates to all generally available information and can

    be deferred for some time.

    Example: Figure 44 shows a notification about how to enhance the lifetime of wipers.

    While this is not important for the overall progress, it is valuable knowledge for the user.

    Predicate Prompt Predicate Prompts are deferrable and expect simple yes/no feedback.

    Example: Figure 40 shows such a case. At first, the question is not that urgent and can be

    deferred; if it becomes more urgent, its type is changed to a Predicate Request.

    Prompt Prompts can be used to ask for more complex feedback. This type is also deferrable.

    Example: The further process in Figure 21 depends on the type of snow chains used. Thus, it is

    important to find out which snow chains the user wants to use. Such information

    could be asked for explicitly or implicitly. In the implicit case, the system not only needs to find

    out that the user picked up the snow chains; it also needs to sense and process the correct

    type. Thus, the expected feedback is much more complex.

    Predicate Request Predicate Requests describe the situation where a smart product asks for

    non-deferrable, simple feedback. This type differs from the Predicate Prompt only in its

    urgency.

    Example: The difference in urgency can be communicated by using audio messages

    and by visual means; compare Figure 34 (details of instructing the user to place a coffee

    cup in the Cocktail Companion) and Figure 35 (Cocktail Companion actively trying to

    get feedback from the user).

    Request Like the Predicate Request, Requests require timely feedback, but the expected

    feedback can be more complex. This type differs from the Prompt only in its urgency.

    Example: Figure 16 shows the selection of the next task. This can be important/urgent

    for the further progress; thus, the task selection can be realized using a Request. In this

    case, the feedback consists of a concrete selected task that shall be executed next.

    Warning It is crucial that Warnings are recognized by the user as soon as possible and thus

    cannot be delayed.

    Example: Warnings can be used for different strategies, e.g. see Figure 41. Ignoring the

    question for too long might be dangerous.
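    The nine types described above can be encoded along the two dimensions of Figure 1 (importance and expected user response). The sketch below is one possible encoding with hypothetical names; in particular, the placement of Warning in the no-feedback column is inferred from its description, not stated explicitly.

    ```python
    from enum import Enum

    class Importance(Enum):
        OMITTABLE = 0   # can be omitted
        DEFERRABLE = 1  # can be deferred
        IMMEDIATE = 2   # cannot be deferred

    class Feedback(Enum):
        NONE = 0        # no expected feedback
        SIMPLE = 1      # simple predicate feedback (yes/no)
        COMPLEX = 2     # complex feedback

    # (importance, expected feedback) -> Interaction Type, following the
    # descriptions in Section 2.2.
    IAT = {
        (Importance.OMITTABLE, Feedback.NONE): "Phrase",
        (Importance.OMITTABLE, Feedback.SIMPLE): "OptionalPredicatePrompt",
        (Importance.OMITTABLE, Feedback.COMPLEX): "OptionalPrompt",
        (Importance.DEFERRABLE, Feedback.NONE): "Notification",
        (Importance.DEFERRABLE, Feedback.SIMPLE): "PredicatePrompt",
        (Importance.DEFERRABLE, Feedback.COMPLEX): "Prompt",
        (Importance.IMMEDIATE, Feedback.NONE): "Warning",
        (Importance.IMMEDIATE, Feedback.SIMPLE): "PredicateRequest",
        (Importance.IMMEDIATE, Feedback.COMPLEX): "Request",
    }
    ```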

    2.3 A Model for Generating UIs from Workflows

    Workflows are one way to formalize procedural knowledge about tasks which can be

     processed by or with the help of smart products. In 1999, the Workflow Management Coalition

    (WfMC) defined a workflow as “[t]he automation of a business process, in whole or part,

    during which documents, information or tasks are passed from one participant to another for

    action, according to a set of procedural rules” [WfMC-1999]. As the user study from [D.5.5.1]

    shows, users feel more comfortable while being guided if they have an overview of the steps

    they have already processed or still have to process. W.l.o.g., the following methods will be

    depicted with GUIs, but the statements also hold for other modalities like speech interfaces. The

    easiest solution to create the required overview might be to simply show all activities of the

    workflow in one big list. However, this might cause serious confusion if the workflow describes

    a more complex task containing many activities. Another approach might be to describe

    exactly the UI to show. Since the UIs should represent the current state of the progress, e.g. by

    highlighting the currently active steps, every predefined UI would need to contain the set of

    steps and a mark indicating which ones have to be highlighted. This may look nice during runtime but

    is not maintainable for the developers of workflows.


    Thus, the concept of Master and Slave UIs (MUI/SUI) has been created. A MUI describes the

    scope of a UI, while the SUIs describe certain sub-steps. In the SmartProducts demonstrator

    Cocktail Companion, for example, the user has to select a cocktail that she wants to mix. The

    scope could in this case be the concrete cocktail, like the “Sweet Dreams”. The SUIs are then

    attached to the different steps like “adding ice” or “adding 5cl pineapple juice”. The result is

    a screen as shown in Figure 2.

    Figure 2: Master and Slave UIs from the Cocktail Companion demonstrator

    So far, basic sequential and parallel combinations of MUIs and SUIs have been examined. If a

    MUI is followed by a sequence of SUIs, the currently active activity highlights its

    representing UI (see Figure 3). If a MUI is followed by a parallel set of SUIs, all of them are

    active and the execution order does not matter; thus, all SUIs are marked as active (see Figure

    4). In both cases, when the next MUI in a sequence is added, it replaces the old MUI together with

    its SUIs.
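    The highlighting rules for sequential and parallel SUIs amount to: in a sequence only the currently active step is marked, in a parallel set all steps are marked. A small sketch, with assumed data shapes (SUIs identified by their activity names, a mode flag, and the set of active activities):

    ```python
    # Sketch of marking SUIs as active under a MUI. `mode` distinguishes a
    # sequential chain of SUIs from a parallel set; `active` is the set of
    # currently active workflow activities. Names are illustrative.

    def mark_suis(suis, mode, active):
        if mode == "parallel":
            # Execution order does not matter: all SUIs are active at once.
            return {s: True for s in suis}
        # Sequential: only the SUI of the currently active activity is marked.
        return {s: (s in active) for s in suis}
    ```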


    [Figure: MUI “Add Coffee To Cocktail” with the sequential SUIs “Place Cup”, “Wait for Coffee”, “Add Coffee” and “Add Sugar”; only one SUI is active at a time.]

    Figure 3: Sequential SUIs

    [Figure: MUIs “Select Tools” and “Start Mounting”; the MUI “Select Tools” is followed by the parallel SUIs “Select Tool 1”, “Select Tool 2” and “Select Tool 3”; all SUIs are active.]

    Figure 4: Parallel SUIs

    2.3.1 Displaying UIs

    The Interaction Manager (IAM) handles incoming and outgoing interaction as described in

    [D5.4.1, D5.4.2]. It uses the loop from Listing 1 to select UIs from the UI queue that shall be

    displayed next. In our case, a UI consists of three parts: (i) the ID of the UI to show, (ii) a

    deadline for the UI and (iii) an Interaction Type. While the ID is used by later instances in the

    UI processing chain (like the Multimodality Manager or the UI Adapter) to identify suitable

    UIs, e.g. graphical or audible UIs, the deadlines and Interaction Types are used directly by the

    Interaction Manager to select the next UI to display. Please note that the sets, functions and operations are explained in Section 2.4. Further, it is

    important to understand that the Interaction Manager only selects the UI that shall be

    provided to the user, not the exact modality. As explained in [D5.4.1] and [D5.4.2], the

    UI is only an abstract description and will be concretised by the Multimodality Manager and

    the UI Adapter. Thus, operations like display(Z) are only responsible for selecting and

    forwarding the UI to the components responsible for finally preparing and providing the UI.


    Q = {};
    D = {};
    A_A = {};
    while (running) {
        // Select elements, which shall be displayed,
        // from the queue, if available
        if (|Q| >= 1) {
            display(Z);
        }
        SLEEP UNTIL {
            a workflow activity with a UI changed its state
            OR Q changes (maybe a UI is queued from outside the
               workflows, thus, recheck what to display)
            OR a registered deadline for a UI expires
        }
        // Trigger the set changes and add all required UIs to the queue
        if (workflow activity Ai is started) {
            onActivate(Ai, Z);
        } else if (workflow activity Ai is completed) {
            onDeactivate(Ai, Z);
        } else if (deadline expired for UI Ui) {
            onDeadlineExpired(Ui, Z);
        }
    }

    Listing 1: Main Processing Loop of Interaction Manager

    As long as the Interaction Manager is running, it selects the most urgent UI from the queue and

    waits for either a workflow to continue, a UI to be triggered externally, or the deadline of a

    UI to expire, which also influences the real-time display order. After such a trigger has occurred,

    the Interaction Manager takes the appropriate actions and rechecks which UI shall be shown

    in the new situation.
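    The selection step of Listing 1 can be approximated by a small priority queue over (Interaction Type, deadline) pairs. The urgency ordering and the class shape below are illustrative assumptions (non-deferrable before deferrable before omittable types), not the platform code:

    ```python
    import heapq

    # Illustrative urgency of Interaction Types, most urgent first.
    URGENCY = {"Warning": 0, "PredicateRequest": 1, "Request": 2,
               "PredicatePrompt": 3, "Prompt": 4, "Notification": 5,
               "OptionalPredicatePrompt": 6, "OptionalPrompt": 7, "Phrase": 8}

    class InteractionManager:
        def __init__(self):
            self.queue = []  # heap of (urgency, deadline, ui_id)

        def queue_ui(self, ui_id, iat, deadline):
            heapq.heappush(self.queue, (URGENCY[iat], deadline, ui_id))

        def select_next(self):
            # Most urgent UI first; ties broken by the earlier deadline.
            return self.queue[0][2] if self.queue else None
    ```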

    2.3.2 MUI/SUI based Mock-Up

    As already briefly introduced, the Cocktail Companion demonstrator was partly created

    as a mock-up for representing and testing the MUIs/SUIs. In this section, we provide a short

    description of the Cocktail Companion together with some screenshots and images taken during usage.

    The purpose of the Cocktail Companion is to assist the user in executing tasks, in this case

    preparing cocktails. The tasks are modelled as workflows, see Figure 5. When the user logs

    in, the Cocktail Companion greets the user and offers a set of cocktails. So far, every activity is

    assumed to provide a MUI, thus fully replacing previous UIs. The user selects the desired

    cocktail and the Cocktail Companion guides her through a step-by-step preparation process.

    These step-by-step descriptions are realized by the combination of MUIs and SUIs attached

    to the activities of the workflows, as depicted in Figure 5. We realized a set of different

    cocktails to make the UI more convincing.


    Figure 5: Cocktail Companion MUI/SUI usage

    The following figures show some of the central UIs of the Cocktail Companion, followed by

    some pictures showing the UI during usage.

    Figure 6: Cocktail Companion MUI of the login activity


    Figure 7: The MUI of the login screen in the real setting in the Cocktail Companion

    The login UI is realized by a single MUI (of type Prompt), as shown in Figures 5 to 7.

    When the workflow is activated and the “Login” activity is started, the UI for logging in is

    queued in the Interaction Manager. The IAM checks which UI to display and sends a request to

    the responsible components (here, the Multimodality Manager, which selects a GUI). Once the user is

    logged in, the workflow completes the login activity and switches to the next activity, “Select Cocktail”. This activity is also annotated with a MUI; thus, the IAM replaces the previous

    UI in the queue, which means that the old UI is removed and the new UI is added. Figure 9

    shows the UI of the newly active activity, which is again of type Prompt.


    Figure 8: Cocktail Companion MUI welcome screen

    Figure 9: Cocktail Companion MUI of the cocktail selection activity

    The next figures show the usage of combined MUIs/SUIs. While the previous UIs consisted of

    a single description, this UI is assembled from MUIs/SUIs. Figure 10 shows the MUI of such a


    recipe. Note that the steps of the recipe are not listed inside the MUI itself; they are

    generated from the different steps the user has to process. Besides technical details like the ID

    of the UI, which is used for internal management, the description contains data like a short

    and/or a longer, detailed description, or media like pictures. Figure 11 shows a sequential

    ordering of five SUIs, providing the list of steps for making a certain cocktail. In this case, one

    activity is marked as active, shown in black font, while the others are greyed out.

    Figure 10: Cocktail Companion MUI for a cocktail recipe without any SUIs

    Figure 11: Cocktail Companion MUI of the cocktail preparation with the steps as SUIs

    For the current version of the Cocktail Companion, only very limited error handling has

    been realized. If, e.g., the user adds too much of a certain ingredient, the visualization changes


    its colour, as shown in Figure 12. Optionally, a more obtrusive warning could also be

    initiated, depending on the importance of the activity. However, more complex error handling

    would then need to be reflected in the workflows of the different recipes.

    Figure 12: Warning when too much vodka has been added in the Cocktail Companion

    Figure 13: The Cocktail Companion SUI for measuring the amount of filled-in vodka in

    the real setting


    2.4 Formal Model for Using Interaction Types

    So far, only verbal descriptions of the principles of operation of the IAM have been presented.

    The model introduced in this section is the mathematically founded theory behind it

    and covers commonly used UIs. If, however, for example in Figure 4 the activity “SelectTool3” itself consisted of a MUI followed by some SUIs, this is not yet described in the formal

    model; currently, this MUI would be treated like a SUI. The model describes the functionality

    of the Interaction Manager as a finite state machine with states Z. It consists of a

    definition of required sets, functions and operations, whereby operations are functions which

    cause side effects and thus directly manipulate Z. The basic idea is to use a queue for UIs that

    shall be shown and to select the most appropriate UI during runtime. Following this approach, the

    formulas below define three kinds of rules:

    -  Rules describing the conditions under which UIs should be queued in the general UI queue

    -  Rules describing when UI conditions may change, e.g. by exceeding a deadline

    -  Rules describing when and how to select UIs from the queue

    2.4.1 Definition of Sets and the States

    The following sets describe the state of the workflows the interaction is based on, and the UIs that

    shall be displayed, either because of active workflows or for other external reasons. The interaction

    can thereby be caused by different activities, e.g. in parallel.

    A                Set of workflow activities as defined in [WfMC-1999]
    T                Set of workflow transitions as defined in [WfMC-1999]
    W = (A, T)       A workflow is a tuple of A and T
    A* ⊆ A           Set of all activities with an attached UI
    A_A ⊆ A          Set of all activities which are currently active
    U                Set of all possible UIs
    Q ⊆ U            UI queue that contains UIs to display
    D ⊆ Q            Currently displayed UIs, e.g. UIs visible on some screen
    U_M ⊆ U          Set of all MUIs
    U_S ⊆ U          Set of all SUIs
    U_M ∩ U_S = ∅    A MUI must not be a SUI
    U_M ∪ U_S = U    Every UI is either a MUI or a SUI
    U_S,i ⊆ U_S      Set of all SUIs of master UI U_i:
                     U_j ∈ U_S,i ⇔ U_j ∈ U_S ∧ isMUI(U_j, U_i)
    U_X ⊆ U,         Set of all (master or slave) UIs with a certain IAT X, where
    X ∈ IAT          IAT = {W, PHR, PR, R, PP, P, OPP, OP}
                     (W = Warning, PHR = Phrase, PR = Predicate Request, R = Request,
                     PP = Predicate Prompt, P = Prompt, OPP = Optional Predicate Prompt,
                     OP = Optional Prompt), i.e.
                     U_IAT = {U_W, U_PHR, U_PR, U_R, U_PP, U_P, U_OPP, U_OP}
                     The sets of interaction types are pairwise disjoint:
                     ∀ X, Y ∈ IAT, X ≠ Y ⇒ U_X ∩ U_Y = ∅

    Table 1: Sets of the Interaction Model

    The interaction with workflows can thus be described as a finite state machine with states:

    Z = (Q, D, A_A, U_IAT)
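    The state tuple Z = (Q, D, A_A, U_IAT) can be carried around explicitly as a set of sets; a minimal sketch, with field names chosen here for illustration:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class State:
        """Sketch of the Interaction Manager state Z = (Q, D, A_A, U_IAT)."""
        Q: set = field(default_factory=set)        # UI queue, Q subset of U
        D: set = field(default_factory=set)        # displayed UIs, D subset of Q
        active: set = field(default_factory=set)   # active activities A_A
        by_iat: dict = field(default_factory=dict) # partition of U by IAT
    ```

    Operations like queueUI then map one such State to a new one by manipulating Q, as defined in Section 2.4.3.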

    2.4.2 Definition of Functions

    Functions are either mappings or assertions. They are mainly used to support the expression of

    operations, which are listed in the next section. First the functions will be declared, together

    with a short explanation, remarks and facts. Then the mathematical formula will express the

    concrete semantics.

    *:ui A U  →   Returns: the user interface of a given activity

    -  Bijektive

    -   Notation: let w.l.o.g. iU   be the UI of i A  

    ( ) iiui A U  =  

    : follows A A bool × →   Returns: true, if the first passed activity logically follows the second

    Copyright © SmartProducts Consortium 2009-2012

  • 8/20/2019 SmartProducts -Interaction Strategies

    27/92

    SmartProducts WP5 – Multimodal User Interfaces and Context-Aware User Interaction

    Deliverable D.5.1.3: Final Description of Interaction Strategies and Mock-Up UIs for Smart Products

    SmartProducts_D_5_1_3.doc Dissemination Level: Public Page 26

    one

    -  Transitive

    -  j A  follows i A :

    i j A is required tobe finished to start A  

    -  In a sequence of activities, there can only be one activity

    active

    (   ( ) ( )),   , ,i j j i  A Ai j ji   A A follows A A follows A A A A A A∈ ∧ ∨ ∧ ∈ ⇒ ∉

    ( )

    ( )   ( )1, , 1 1 1

    .

    , . , ( , )

    ,

    ,

     

    i j

     j i i j i i j j

    t T t A A

     follows A A or A A foll  true if  

    ows A A follows A A

     false else

    + … − + −

    ⎧⎪⎨

    ⎧   ∃ ∈ =⎪

    =   ∃ ∈ ∧…⎪   ∧⎨ ⎩⎪⎩  

    :concurrent A A bool  × →   Returns: true, if the order of the two activities is not determinableduring design time-  Two activities are concurrent, if there is no sequence relation

     between them

    (   ( ) ( )( )   ), ,

    ,,

    , !

     

    i j j

    i

    i

     j

    ollows A A follows A Aconcurrent A A

     false else

    true if f  ⎧   ∨⎪= ⎨

    ⎪⎩  

    : M M 

     followingMUI U U bool × →   Returns: true, if the UI given as first parameter is a direct successormaster UI of the second UI; this means no other master UI is allowed

    in between

    ( )

    ( ) ( )   ( )( )( ),

    *

    *

    ,,

    . ,

    .

    ,

    ,

    ,

     j

    k j k 

    i j i

     j i k k M i

     A A follows A A

     followingMUI U   A A A U follows A A follows A A

     false

    true if  U  ui

    else

    ⎧   ∈⎪⎪

    = ⎨   ∧¬ ∃ ∈ ∈ ∧ ∧⎪⎪⎩  

    :S M 

    isMUI U U bool  × →   Returns: true, if the UI given as first parameter is a direct successormaster UI of the second UI; this means no other master UI is allowed

    in between

    ( )

    ( ) ( )   ( )( )( ),

    *

    *

    ,,

    .

    .

    ,

    ,

    , ,

     

    i j i

     j i k k 

     j

    k j k  M i

     A A follows A A

    isMUI U    A A A U followtrue if  

    U   s A A follows A A

    else

    ui

    ⎧   ∈⎪⎪

    = ⎨   ∧¬ ∃ ∈ ∈ ∧ ∧⎪ false⎪⎩  

    :S M 

     subUI U U bool × →   Returns: true, if the first passed UI is a slave UI of the passed MUI

    Copyright © SmartProducts Consortium 2009-2012

  • 8/20/2019 SmartProducts -Interaction Strategies

    28/92

    SmartProducts WP5 – Multimodal User Interfaces and Context-Aware User Interaction

    Deliverable D.5.1.3: Final Description of Interaction Strategies and Mock-Up UIs for Smart Products

    SmartProducts_D_5_1_3.doc Dissemination Level: Public Page 27

    ( )( )

    ( ) ( )   ( )( )

    *

    , ,.

    *,

    . ,, ,

     j

    i j i

    k k i j k   M 

     A A follows A A

     subUI U 

    ,

     j i  A A U follows A A follows A A

     false

    true if  U  u

    els

    i

    ⎧   ∈⎪⎪

    = ⎨   ∧¬ ∃ ∈ ∈ ∧ ∧

    ⎪⎩ e⎪

     

activeUI: U → bool
Returns: true, if the activity of this UI is active.

activeUI(U_i) =
    true,   if A_i ∈ A_A
    false,  else

chooseDisplayUI: U → P(U)
Returns: the set (since there might be slave UIs) of UIs that are of the same IAT as the given one but should be displayed before U_i.

chooseDisplayUI(U_i) =
    {U_j},            if U_i ∈ U_S ∩ U_X ∧ ∃ U_j ∈ U_S ∩ U_X ∧ ∀ U_k ∈ U_S ∩ U_X: queued(U_k) ≤ queued(U_j)
    {U_j} ∪ U_{S,j},  if U_i ∈ U_M ∩ U_X ∧ ∃ U_j ∈ U_M ∩ U_X ∧ ∀ U_k ∈ U_M ∩ U_X: queued(U_k) ≤ queued(U_j)
    {U_i},            else
(X ∈ IAT denotes the interaction type of U_i)

    Table 2: Functions of the Interaction Model
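The graph-based helper functions of Table 2 can be sketched in executable form. The following Python snippet is a purely illustrative sketch, not part of the deliverable: the activity graph, the set of master UIs, and the function names `follows_star` and `following_mui` are assumptions chosen for the example.

```python
# Illustrative sketch (assumed names/data): a toy activity graph where
# FOLLOWS[a] is the set of activities that directly follow activity a,
# and MASTER_UIS marks activities whose UI is a master UI.
FOLLOWS = {"A1": {"A2"}, "A2": {"A3"}, "A3": {"A4"}, "A4": set()}
MASTER_UIS = {"A1", "A3", "A4"}

def follows_star(a_i, a_j):
    """True if activity a_i (transitively) follows activity a_j."""
    seen, stack = set(), [a_j]
    while stack:
        cur = stack.pop()
        for nxt in FOLLOWS.get(cur, ()):
            if nxt == a_i:
                return True
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return False

def following_mui(a_i, a_j):
    """True if the (master) UI of a_i is the direct successor master UI
    of the UI of a_j, i.e. no other master UI lies in between."""
    if a_i not in MASTER_UIS or not follows_star(a_i, a_j):
        return False
    return not any(
        a_k in MASTER_UIS and follows_star(a_i, a_k) and follows_star(a_k, a_j)
        for a_k in FOLLOWS
    )
```

With the chain A1 → A2 → A3 → A4 above, A3 is the direct successor master UI of A1 (A2 is not a master UI), whereas A4 is not, because A3 lies in between.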

    2.4.3 Definition of Operations

Operators are functions that map from some preimage Z to an image within Z'. They manipulate any of the sets of Z, e.g. they queue UIs in Q. This section is structured similarly to Section 2.4.2: first an overview of the operations is given, together with explanations, remarks and facts, followed by the mathematical description.

queueUI: U × Z → Z
Modifies Q
‐ U_i is added to Q (including all sub UIs, if U_i ∈ U_M)

queueUI(U_i, Z) = Z', where Z' = (Q', D, A_A, U_IAT) and
    Q' = (Q \ {U_j : U_j ∈ U_Phrase}) ∪ {U_i},  if U_i ∈ U_Phrase ∧ Q ∩ U_Phrase ≠ ∅
         Q ∪ {U_i} ∪ U_{S,i},                   if U_i ∈ U_M
         Q ∪ {U_i},                             else

unqueueUI: U × Z → Z
Modifies Q
‐ U_i is removed from Q (including all sub UIs, if U_i ∈ U_M)

unqueueUI(U_i, Z) = Z', where Z' = (Q', D, A_A, U_IAT) and
    Q' = Q \ ({U_i} ∪ U_{S,i}),  if U_i ∈ U_M
         Q \ {U_i},              else

replaceUI: U × U × Z → Z
Modifies Q
‐ If a master UI is replaced by another master UI, all its sub UIs are removed as well

replaceUI(U_i, U_j, Z) = Z', where Z' = queueUI(U_i, unqueueUI(U_j, Z))
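The three queue-manipulating operations can be illustrated with a small executable sketch. The class and field names below (`InteractionState`, `sub_uis`, etc.) are assumptions made for the example, not part of the deliverable's model.

```python
# Illustrative sketch of queueUI, unqueueUI and replaceUI.
class InteractionState:
    def __init__(self):
        self.queue = []     # Q: queued UIs, in queuing order
        self.sub_uis = {}   # master UI -> list of its slave (sub) UIs

    def queue_ui(self, ui):
        """queueUI: add ui to Q; a master UI drags its sub UIs along."""
        if ui not in self.queue:
            self.queue.append(ui)
        for sub in self.sub_uis.get(ui, []):
            if sub not in self.queue:
                self.queue.append(sub)

    def unqueue_ui(self, ui):
        """unqueueUI: remove ui (and its sub UIs, if a master UI) from Q."""
        doomed = {ui, *self.sub_uis.get(ui, [])}
        self.queue = [u for u in self.queue if u not in doomed]

    def replace_ui(self, new_ui, old_ui):
        """replaceUI(U_i, U_j, Z) = queueUI(U_i, unqueueUI(U_j, Z))."""
        self.unqueue_ui(old_ui)
        self.queue_ui(new_ui)
```

For instance, queuing a master UI "m1" with slave "s1" yields Q = [m1, s1]; replacing "m1" with a master "m2" removes both "m1" and "s1" before queuing "m2", as the replaceUI composition requires.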

onDeadlineExpired: U × Z → Z
Modifies U_IAT
‐ If the deadline of a Predicate Prompt (PP) expires, it becomes a Predicate Request (PR)
‐ If the deadline of a Prompt (P) expires, it becomes a Request (R)
‐ If the deadline of a Predicate Request (PR) expires, spawn a new Warning (W), pointing at the Request
‐ If the deadline of a Request (R) expires, spawn a new Warning (W), pointing at the Request
‐ If the deadline of an Optional Predicate Prompt (OPP) expires, it is removed from the queue
‐ If the deadline of an Optional Prompt (OP) expires, it is removed from the queue

onDeadlineExpired(U_i, Z) = Z', where Z' =
    moveUI(U_i, U_PP, U_PR, Z),       if U_i ∈ U_PP
    moveUI(U_i, U_P, U_R, Z),         if U_i ∈ U_P
    queueUI(U_j, Z) with U_j ∈ U_W,   if U_i ∈ U_PR ∨ U_i ∈ U_R
    unqueueUI(U_i, Z),                if U_i ∈ U_OPP ∨ U_i ∈ U_OP
    Z,                                else
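The deadline rules above amount to a type-transition table plus two side effects (spawning a Warning, dropping the UI). A possible sketch, with table and function names chosen for illustration only:

```python
# Illustrative sketch of onDeadlineExpired: escalate or drop a UI
# when its deadline passes. IAT abbreviations follow the deliverable:
# PP/P prompts, PR/R requests, OPP/OP optional prompts, W warnings.
ESCALATE = {"PP": "PR", "P": "R"}   # expired prompts become requests
SPAWN_WARNING = {"PR", "R"}         # expired requests spawn a Warning
DROP = {"OPP", "OP"}                # expired optional prompts are removed

def on_deadline_expired(ui_type, queue, ui):
    """Return (new_type, new_queue) after the deadline of `ui` expires."""
    if ui_type in ESCALATE:
        return ESCALATE[ui_type], queue                 # moveUI to stricter IAT
    if ui_type in SPAWN_WARNING:
        return ui_type, queue + [("W", ui)]             # queue a Warning pointing at ui
    if ui_type in DROP:
        return ui_type, [u for u in queue if u != ui]   # unqueueUI
    return ui_type, queue                               # all other IATs: unchanged
```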

     

moveUI: U × U_IAT × U_IAT × Z → Z
Modifies U_IAT
‐ Moves an element U_i from one set of IATs to another set, thus changing its type

moveUI(U_i, U_X, U_Y, Z) = Z', where Z' = (Q, D, A_A, U'_IAT) and U'_X = U_X \ {U_i} and U'_Y = U_Y ∪ {U_i}, X ∈ IAT, Y ∈ IAT

ignoreUI: U × Z → Z
Does not influence Z
‐ Drop the UI without adding it to Q

ignoreUI(U_i, Z) = Z

setActive: A × Z → Z
Modifies A_A
‐ This method is used to add an activity to the set of active activities A_A

setActive(A_i, Z) = Z', where Z' = (Q, D, A'_A, U_IAT) and A'_A = A_A ∪ {A_i}

onActivate: A × Z → Z
Modifies A_A, maybe Q
‐ Used, when an activity A_i is started

onActivate(A_i, Z) = Z', where Z' =
    setActive(A_i, replaceUI(U_i, U_j, Z)),  if A_i ∈ A* ∧ ∃ U_j: followingMUI(U_i, U_j)
    setActive(A_i, queueUI(U_i, Z)),         else if A_i ∈ A* ∧ U_i ∉ Q
    setActive(A_i, Z),                       else

setInactive: A × Z → Z
Modifies A_A
‐ This method is used to remove an activity from the set of active activities A_A

setInactive(A_i, Z) = Z', where Z' = (Q, D, A'_A, U_IAT) and A'_A = A_A \ {A_i}

onDeactivate: A × Z → Z
Modifies A_A, maybe Q
‐ Used, when an activity A_i is finished
‐ If U_i ∈ U_M, the UI and its sub UIs are removed

onDeactivate(A_i, Z) = Z', where Z' =
    setInactive(A_i, unqueueUI(U_i, Z)),  if A_i ∈ A* ∧ U_i ∈ U_M
    setInactive(A_i, Z),                  else

display: Z → Z
Modifies D
‐ Display the most highly rated UI (preferred UI)
‐ Also shows sub UIs, if U_i ∈ U_M
‐ If there is a W, instantly show it
‐ If there is a PR within the stack & no higher rated UI, show it
‐ If there is a R within the stack & no higher rated UI, show it
‐ If there is a N within the stack & no higher rated UI, show it
‐ If there is a PP within the stack & no higher rated UI, show it
‐ If there is a P within the stack & no higher rated UI, show it
‐ If there is an OPP within the stack & no higher rated UI, show it
‐ If there is an OP within the stack & no higher rated UI, show it
‐ If there is a PHR within the stack & no higher rated UI, show it

display(Z) = Z', where Z' = (Q, D', A_A, U_IAT) and
    D' = chooseDisplayUI(U_i),  if ∃ U_i ∈ Q ∩ U_W
         chooseDisplayUI(U_i),  if ∃ U_i ∈ Q ∩ U_PR  ∧ ∀ U_j ∈ Q: U_j ∉ U_W
         chooseDisplayUI(U_i),  if ∃ U_i ∈ Q ∩ U_R   ∧ ∀ U_j ∈ Q: U_j ∉ U_W ∪ U_PR
         chooseDisplayUI(U_i),  if ∃ U_i ∈ Q ∩ U_N   ∧ ∀ U_j ∈ Q: U_j ∉ U_W ∪ U_PR ∪ U_R
         chooseDisplayUI(U_i),  if ∃ U_i ∈ Q ∩ U_PP  ∧ ∀ U_j ∈ Q: U_j ∉ U_W ∪ U_PR ∪ U_R ∪ U_N
         chooseDisplayUI(U_i),  if ∃ U_i ∈ Q ∩ U_P   ∧ ∀ U_j ∈ Q: U_j ∉ U_W ∪ U_PR ∪ U_R ∪ U_N ∪ U_PP
         chooseDisplayUI(U_i),  if ∃ U_i ∈ Q ∩ U_OPP ∧ ∀ U_j ∈ Q: U_j ∉ U_W ∪ U_PR ∪ U_R ∪ U_N ∪ U_PP ∪ U_P
         chooseDisplayUI(U_i),  if ∃ U_i ∈ Q ∩ U_OP  ∧ ∀ U_j ∈ Q: U_j ∉ U_W ∪ U_PR ∪ U_R ∪ U_N ∪ U_PP ∪ U_P ∪ U_OPP
         chooseDisplayUI(U_i),  if ∃ U_i ∈ Q ∩ U_PHR ∧ ∀ U_j ∈ Q: U_j ∉ U_W ∪ U_PR ∪ U_R ∪ U_N ∪ U_PP ∪ U_P ∪ U_OPP ∪ U_OP

    Table 3: Operations of the Interaction Model
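The priority cascade of the display operation reduces to a first-match scan over the IAT ranking W > PR > R > N > PP > P > OPP > OP > PHR. The snippet below is a minimal sketch under that reading; the list and function names are assumptions made for illustration.

```python
# Illustrative sketch of the display operation's priority cascade:
# show the queued UI(s) of the highest-ranked interaction type (IAT).
PRIORITY = ["W", "PR", "R", "N", "PP", "P", "OPP", "OP", "PHR"]

def choose_display(queue):
    """queue: list of (iat, ui) pairs. Return the UIs of the best-ranked
    IAT present in the queue (stands in for chooseDisplayUI)."""
    for iat in PRIORITY:
        chosen = [ui for (t, ui) in queue if t == iat]
        if chosen:
            return chosen
    return []
```

For example, with a Prompt, a Warning and a Request queued simultaneously, the Warning wins; only once no higher-rated IAT remains in the queue does a lower-rated UI get displayed.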


3 Mock-Up UIs

This section presents mock-up UIs, developed in the SmartProducts project for testing various interaction strategies and types in different application domains: cooking (guiding through recipe preparation steps), automotive (guiding through the steps of mounting snow chains and of the wiper changing procedure), aircraft assembly, and entertainment (guiding through origami folding steps). Origami is the art of folding beautiful figures from paper, and guiding for origami resembles guiding in smart products scenarios. The origami application was developed for research purposes because arranging user tests in a realistic origami scenario is easier than arranging user tests in a realistic car servicing scenario. The origami application was found very useful for studying various generic interaction problems, even though the entertainment domain is not present in the SmartProducts project scenarios.

The tested input modalities of the mock-ups included various sensors (accelerometer, RFID, audio, camera) and GUI, and the output modalities include audio and GUI (text and images/videos). Work packages 2 and 3 also contributed to the mock-up development: first of all, all presented mock-ups utilise ontologies developed in cooperation with WP2; second, the question answering tool and the SmartProducts Monitor, both developed in WP3, directly implement certain interaction strategies.

The implementation of the mock-ups depends on the application domain and on the device, but all of them contain elements for enabling SmartProducts interaction strategies, although not necessarily for enabling all strategies. The presented screenshots often illustrate two or more strategies because, for example, it is natural to combine the "explain user actions" strategy with the short-term and/or long-term customisation strategy: users may get very annoyed if they are not allowed to correct smart product behaviour after they understand the reasons for it. Often different implementations of one strategy are presented, for example, various guiding options, because implementation details depend on the application domain and device. We do not aim at comparing various interfaces; we only aim at pointing out the main interface elements required for enabling each interaction strategy. However, we would like to point out that the majority of the presented mock-ups were tested in several user studies [D5.2.1.Annex, D5.5.1, Vildjiounaite-2011] and accepted by the test subjects. The preliminary mock-ups for aircraft assembly, which are slightly different from those presented in this document, were tested recently in the user study at EADS. The study confirmed the feasibility of the planned interfaces presented in this document. The Cocktail Companion was presented to several subjects in the course of development and at the ICT 2010 [ICT] and iTEC 2010 [iTEC] events.


3.1 “Default functionality” strategy

Default functionality is the functionality which should always be possible to access, and fairly quickly. The absolute minimum of default functionality is task selection. As smart products provide user-adapted interaction, default functionality also includes user login. Depending on the device size and the domain-dependent probability of how frequently the default functionality may be needed, these elements may be included in the main application screen or accessed by activating a separate screen.

Figure 14 illustrates that non-personalised task selection in the origami application is available via the main menu, which opens a separate screen with the list of tasks and their descriptions, presented in Figure 15. The main menu in the origami application also provides a personalised task selection option; its menu opens if the “Get Recommendation” menu item is selected or the corresponding shortcut key is pressed. (Each menu item has an available shortcut key (using the Alt-Key mask) and a mnemonic, allowing keyboard interaction in addition to the use of a mouse. Disabled menu options are indicated by greying out.)

    Figure 14: The Menu showing the sub-menu for browsing and searching the origami

    folds database


    Figure 15: Task selection in origami application for a large screen

Personalised task selection is enabled only after user login. Figure 51 and Figure 71 present personalised task suggestions for different user preferences. Figure 16 presents an example of personalised task selection in the cooking assistant (i.e., selection of favourite recipes), which does not require opening a separate screen, but presents only the names of frequently cooked recipes. Selection of one of the favourite recipes in the large-screen GUI can be performed fairly quickly by pressing the “select task” button, which opens the recipe list. Selection of a recipe which is not so well known would require providing more information about this recipe at the same time. This may be achieved by providing additional information upon a click on a recipe name, or by displaying the most important information about the recipe immediately, as is done in Figure 15 for the origami selection task. Naturally, what is important to display differs between recipe selection and origami fold selection; for example, time is more important for recipe selection than for origami fold selection: even a person in a hurry may want to cook and eat, but a person in a hurry will hardly start folding origami.


    Figure 16: Task selection in the cooking assistant for a large screen

Figure 16 also shows that user login in the laptop-based version of the cooking assistant is available via the main application screen, because new family members or guests may join the cooking process at any time, and because the screen is large enough. User login is performed by selecting users’ names from the list (when multiple users log in simultaneously, all their names are highlighted in the list). Figure 17 shows (using the example of a car servicing application running on an N900 mobile phone) how GUI-based login is implemented on the small screen of the phone: the user list is opened in a separate window after the button with the user name(s) is pressed. Task selection for a small screen would also require opening another window, due to the small screen size. In the aircraft assembly mock-up, task selection is also done in a separate window via the main menu, and Figure 25 shows that a button opening this main menu (button (1)) is always visible.

Figure 6 and Figure 7 show the sensor-based login and Figure 9 presents a task selection screen in the Cocktail Companion mock-up. The user login in the Cocktail Companion utilises a SmartProducts Authentication Device (AD): an RFID reader for chipcards. The Cocktail Companion does not allow skipping the login step, since non-personalised task selection is infeasible: the list of cocktails displayed in the next step (see Figure 9) depends not only on the available ingredients, but also on the identified user. If the user is under 18, for example, she would not get cocktail recipes with alcohol.


    Figure 17: User login for a small screen in the car assistant

3.2 “Guide the user” strategy

This strategy is the core of most application scenarios. Guiding through a task in the mock-ups requires interaction elements for:

•  showing an overview of the task steps

•  step-by-step presentation of instructions

•  navigation between the steps

•  recognition of user actions

Elements for the recognition of user actions are optional, as they depend on the availability and type of sensors; presenting an overview of the task steps is convenient for the users, but may be done in a separate screen, depending on the complexity of instructions and the device. The steps overview may also be skipped: for example, in aircraft assembly the correctness of procedures is very important, while allowing operators to freely navigate between steps may be dangerous.


In the cooking domain, guiding through recipe preparation steps is the common case. Figure 18 presents a guiding mock-up for the large screen (laptop) for the “Halloween sausages” recipe: the steps overview is presented in the right part; instructions for the current step are presented to the left of the steps overview; navigation between the steps can be done by pressing the “next” and “previous” buttons or by selecting a step in the overview list. Additionally, the “next” and “back” speech commands can be used for navigation between the steps, and sensor-based recognition of user actions can be used for triggering the transition to the next step. The controls in the lower part of the GUI will be explained below, while presenting the “short-term customisation” strategies. The guiding mock-ups of the Cocktail Companion are presented in the previous section.

    Figure 18: Recipe guiding in a large screen in the cooking assistant
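The navigation just described ("next"/"previous" buttons, overview-list selection, and speech commands) can be funnelled into a single handler. The following Python snippet is a purely illustrative sketch; the class name, event names and example steps are assumptions, not part of the mock-up implementation.

```python
# Illustrative sketch: step navigation in a guiding UI, where button
# presses, overview-list clicks and speech commands all map to events.
class StepNavigator:
    def __init__(self, steps):
        self.steps = steps    # ordered list of instruction steps
        self.current = 0      # index of the currently shown step

    def handle(self, event):
        """event: 'next', 'back'/'previous', or ('goto', index)
        from the overview list. Returns the step to display."""
        if event == "next" and self.current < len(self.steps) - 1:
            self.current += 1
        elif event in ("back", "previous") and self.current > 0:
            self.current -= 1
        elif isinstance(event, tuple) and event[0] == "goto":
            index = event[1]
            if 0 <= index < len(self.steps):
                self.current = index
        return self.steps[self.current]
```

A sensor-based recognition of a completed user action would simply feed a "next" event into the same handler, which keeps the GUI, speech and sensor modalities consistent.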

As users generally appreciate polite behaviour from computers, and as it is useless to instruct users to perform a certain step while they are distracted from the task by a conversation with each other or a phone call, the mock-ups were made capable of delaying audio instructions until a break in the conversation is detected (for more details, see D5.2.1.Annex [D5.2.1.Annex] and the paper [Vildjiounaite-2011]). Interaction via the GUI looks the same as above.

The guiding mock-up for a smaller screen in the cooking domain (for the Nokia N900 phone) is presented in Figure 19. The mobile version of the cooking assistant was developed because people may also cook outside their own homes, for example, in a courtyard or while travelling.


Due to the smaller screen size, only the “next” button was placed in the GUI, while backwards navigation is possible only via selecting a step in the overview list. The other controls will be explained while presenting the “short-term customisation” strategy.

    Figure 19: Recipe guiding in a small screen in the cooking assistant

Figure 20 presents the mock-up of the guiding strategy in the automotive domain for the “mounting snow chains” procedure.

Figure 20: Guiding for a vehicle component mounting in the automotive domain

    The figure presents the main interface to the user, with capabilities to:


•  switch from proactive to manual mode: in the manual mode, the system awaits user instructions, whether the pressing of a button or a vocal command, while in the proactive mode it understands the curr