
For Jury Evaluation

FACULDADE DE ENGENHARIA DA UNIVERSIDADE DO PORTO

Direct Haptics in MASSIVE Virtual Reality Experiences

Cristiano Ramos Carvalheiro

Mestrado Integrado em Engenharia Informática e Computação

Supervisor: Rui Pedro Amaral Rodrigues
Co-supervisor: Hugo Machado da Silva

January 28, 2016

Direct Haptics in MASSIVE Virtual Reality Experiences

    Cristiano Ramos Carvalheiro

    Mestrado Integrado em Engenharia Informática e Computação

    To be approved in oral examination by the committee:

    Chair: Nuno Honório Rodrigues Flores

    External Examiner: Pedro Miguel do Vale Moreira

    Supervisor: Rui Pedro Amaral Rodrigues

    January 28, 2016

Abstract

Virtual Reality (VR) is a technology that has been gradually developed since the beginning of the last century, in which people can interact, in a multi-sensorial immersive experience, with a synthesized three-dimensional Virtual Environment. Although the first applications were simple or did not work very well due to several hardware limitations, many new technologies and techniques have been researched and developed since then. In the last decade, due to the significant evolution of hardware and software, a new rise of interest in VR has been experienced. Nowadays, there are several companies researching and developing devices dedicated to virtual reality experiences.

While devices that cover our visual and auditory senses are relatively well developed, hardware that focuses on the remaining three senses (haptic, olfactory and gustatory) is in its infancy. Only recently has there been a rise of interest in researching technologies that provide solutions to handle those senses.

Over the years, there have been different approaches to the problem of providing haptic feedback from virtual objects. These attempts were mostly based on devices such as gloves or surfaces that resort to different stimulation techniques in order to provide the user with the sense of touch or to limit the movement of his hand or fingers.

These devices, however, face significant limitations, as the current understanding of human haptic perception is quite limited. Hence, the need for alternatives to these devices is identified. Likewise, there is a lack of solutions capable of providing haptic feedback to users in a transparent way when they touch virtual objects.

Related research confirms that the use of real objects in virtual environments positively affects VR experiences. Therefore, this dissertation addresses those problems by proposing a solution for an interaction framework based on direct haptics, in which real objects are used to provide haptic feedback to the users. The solution is capable of redirecting the user to the object that he intends to touch so that, when in contact with virtual objects, the haptic feedback is delivered by equivalent real objects, without the user noticing the manipulation or losing his sense of presence.

The solution was specified in terms of functionality, and an approach to implement and test it was followed. A generic architecture is described so that guidelines for implementing the solution in future work can be followed. In addition, the implementation details of the solution are described and discussed.

A functional prototype implementing a subset of the specified features was developed and validated through user testing sessions. The insights gathered from these tests established guidelines for future work on this solution. It is expected that future iterations of the work initiated here will be carried out and that the contributions of this work can help and inspire researchers and enthusiasts in this area.

This dissertation falls within the Human-Computer Interaction (HCI) area, under the following topics: Haptic Feedback, Virtual Reality, Mixed/Augmented Reality and Interaction Design.


Resumo

Virtual reality is a technology that has been gradually developed since the beginning of the last century, in which one can interact, in a multi-sensorial and immersive experience, with a synthesized three-dimensional virtual environment. Although the first applications were simple or barely functional due to severe hardware limitations, new technologies and techniques have been researched and developed since then. In the last decade, due to a great evolution of hardware and software, there has been a growth of interest in virtual reality. Nowadays, several companies are researching and developing devices dedicated to virtual reality experiences.

While devices that support the visual and auditory senses are relatively well developed, hardware supporting the remaining three senses (haptic, olfactory and gustatory) is still in its infancy. Only recently has there been a surge of interest in researching technologies that provide solutions to handle those senses.

Over the years there have been different approaches to the problem of providing haptic feedback from virtual objects. Most of these attempts are based on devices such as virtual reality gloves or surfaces that rely on different stimulation techniques in order to give the user the sensation of touch or to limit the movement of his hands or fingers.

However, these devices present significant limitations, since the current knowledge of human haptic perception is still limited. Thus, the need for an alternative to these devices is identified. Likewise, there is a need for solutions capable of providing haptic feedback to users in a transparent way when they touch virtual objects.

Related research confirms that the use of real objects in virtual environments positively affects virtual reality experiences. Thus, this dissertation addresses these problems by proposing the development of an interaction framework based on direct haptics, where real objects are used as a means of providing haptic feedback to the users. In this way, the solution will be capable of redirecting the user to the object he intends to touch and, once in contact with the virtual objects, the haptic feedback will be delivered by the equivalent real objects, without the user noticing the manipulation or losing his sense of presence.

The solution was specified in terms of functionality, and an approach to implement and test it was followed. A generic architecture is described so that the guidelines for implementing the solution can be followed in future work. The implementation details of the solution are also described and discussed.

A functional prototype, implementing a subset of the specified functionality, was developed and validated through testing sessions with users. The impressions gathered in these tests established guidelines for future work. It is expected that future development iterations of the work initiated here will be carried out and that its contributions can help and inspire researchers and enthusiasts in this area.

This dissertation addresses the Human-Computer Interaction area under the following topics: Haptic Feedback, Virtual Reality, Augmented/Mixed Reality and Interaction Design.


Acknowledgements

I would like to acknowledge everyone who has contributed to this dissertation. However, one page is not enough to acknowledge all of those that deserve it. I know they will understand.

First, I would like to thank my supervisors, Prof. Rui Rodrigues and Hugo Machado, because without them this dissertation would not exist. The initial idea of using Direct Haptics in VEs came from them in the context of the MASSIVE project. I want to thank them for always being patient, guiding me when I wanted to do more and more, despite always running late. Their contributions are unquestionable and I want to thank them for exposing me to this field. I also want to thank all the volunteers who tested the prototype developed in this work, which helped enrich it.

The final words are for those that shaped what I and this dissertation have become. First of all, I want to thank the woman of my life, Adriana, for the unquestionable support at every moment of the course and especially in this last phase, a source of inspiration to keep moving forward and an example that is hard to match. To my family, for having raised and supported me ever since my childhood, consequently making me the person I am today. In particular to my mother, Regina, for my upbringing and for all the sacrifices she went through so that I could follow my dreams.

A big hug to the friends from my hometown who accompanied me up to the 12th grade – and who keep accompanying me outside university. To Luís, my godfather, for the enormous support and the everlasting friendship. To Manuel, for his intelligence and good humour. To Pedro, for his good humour and way of being. To them and to Ricardo, Diogo, Aires, Ruben and David, for the countless hours we shared, whether playing or just spending time together. To Guilherme, for all the philosophical conversations, his good humour and excellent personality. To André, for his commitment and dedication, an example to follow. To Xico, for his commitment and dedication in everything he does, always in good spirits and ready to help, a true friend and a source of inspiration. To Filinto, Ivo and Joel and the others, for always welcoming me with enthusiasm and good humour.

Finally, a few words for those who accompanied me during the course and kept me motivated over these 5 years. To Luís, for all the rides and the good humour to put up with me for 5 years, a true brother. To him and to Carla, Raunan, Tiago and Miguel, with whom I shared a house over 5 years, and who were always able to keep a lively atmosphere in everyday life. To Pedro, for his intelligence and contagious good humour. To the other Pedro, for his ability to organize and motivate others, always ready to help, a true source of inspiration. To David, for his friendship and good humour. To Rui and Vítor, for the excellent late nights working, full of good humour, which made that one of the best semesters of this course. To the two Ruis and to Vasco, for the hours they spent listening to me talk about my thesis. To Luís, Daniel, Henrique, Luís, Diogo and João, for the excellent hours of companionship full of good humour. To everyone else, thank you for having shared this time with me.

    Cristiano Carvalheiro


“Any sufficiently advanced technology is indistinguishable from magic.”

    Arthur Charles Clarke


Contents

1 Introduction
  1.1 Problem Definition
  1.2 Goals and Methodology
  1.3 Document Structure

2 Review on Haptics in Virtual Reality
  2.1 Virtual Reality
    2.1.1 Definition
    2.1.2 Reality versus Virtuality
    2.1.3 Current State
    2.1.4 Summary
  2.2 Haptic Human-Computer Interaction
    2.2.1 Haptic Interaction
    2.2.2 Haptic Interfaces
    2.2.3 Limitations of Haptic Interfaces
    2.2.4 Summary
  2.3 Exploration and Interaction in Virtual Environments
    2.3.1 Redirected Walking
    2.3.2 Redirected Touching
    2.3.3 Summary
  2.4 Interaction Design in Virtual Reality
    2.4.1 Interaction and Manipulation Design Guidelines
    2.4.2 User-Centered Design
    2.4.3 Testbed Evaluation for Interaction Techniques
    2.4.4 Summary

3 Solution Specification – Direct Haptics Framework
  3.1 Solution Overview
    3.1.1 Architecture Overview
    3.1.2 Redirection
    3.1.3 Hand Tracking
    3.1.4 Object Tracking
  3.2 Methodology
  3.3 Functionality
    3.3.1 Functional Prototype
  3.4 Hardware Requirements and Setup
  3.5 Summary

4 Solution Architecture
  4.1 Framework
  4.2 Objects
  4.3 Redirection
  4.4 Object Models
  4.5 Hand Models
  4.6 Object Tracking
  4.7 Hand Tracking
  4.8 Summary

5 Implementation
  5.1 Technologies
    5.1.1 Simulation Engine
    5.1.2 Hardware Devices
  5.2 Functional Prototype
    5.2.1 Functionality
    5.2.2 Hardware Setup Overview
    5.2.3 Scene Overview
  5.3 Implementation Details
    5.3.1 Implementation Overview
    5.3.2 Redirection
    5.3.3 Object Tracking
    5.3.4 Hand Tracking
    5.3.5 System Calibration
    5.3.6 Test System
    5.3.7 Log System
  5.4 Summary

6 Tests and Validation of the Results
  6.1 Early Test Sessions
  6.2 Final Test Sessions
    6.2.1 Testing Setup
    6.2.2 Users Profile
    6.2.3 Task and Taxonomy
    6.2.4 Outside Factors and Performance Metrics
  6.3 Results
  6.4 Summary and Discussion of the Results

7 Conclusions and Future Perspectives
  7.1 Contributions
  7.2 Future Work
  7.3 Final Remarks

A Solution Features
  A.1 Hand Tracking
    A.1.1 Space Calibration
    A.1.2 Hand Detection
    A.1.3 Hand Tracking
    A.1.4 Hand Representation
    A.1.5 Hand Detect Touch Interaction
    A.1.6 Hand Pinch Touch Interaction
    A.1.7 Hand Grab Touch Interaction
    A.1.8 Modify Attributes
  A.2 Object Tracking
    A.2.1 Space Calibration
    A.2.2 Object Detection
    A.2.3 Object Tracking
    A.2.4 Object Representation
    A.2.5 Support for Multiple Objects
    A.2.6 Modify Attributes
  A.3 Redirection
    A.3.1 Create 1D Pattern
    A.3.2 Create 2D Pattern
    A.3.3 Create 3D Pattern
    A.3.4 Modify Pattern Attributes
    A.3.5 Remove Pattern

B Testing Sessions Assets
  B.1 Documents
    B.1.1 Experience Information
    B.1.2 Informed Consent
  B.2 Session Script
    B.2.1 Introduction
    B.2.2 First Phase
    B.2.3 Purpose of the Experiment
    B.2.4 Second Phase
    B.2.5 Questionnaire Completion
  B.3 Questionnaire
    B.3.1 Simulator Sickness
    B.3.2 Spatial Awareness
    B.3.3 Personal Data

C Log File Template

References

List of Figures

2.1 Definition of Mixed Reality within the context of the Reality-Virtuality Continuum [MC99]
2.2 Basic high-level architecture for a virtual reality application incorporating visual, auditory and haptic feedback [SCB04]
2.3 The CyberGrasp glove to the left and the Phantom device to the right [Bur00, HACH+04]
2.4 Difference between the virtual path (above) and real path (below) of the user using Redirected Walking [RKW01]
2.5 Difference between the virtual finger location (corner) and the real location (in red) after warping space [Koh10]
2.6 Evaluation approach in testbed evaluation [BJH+01]
3.1 A user receiving haptic feedback directly from a sphere (right) coherent with what he sees in the VE (left)
3.2 Updated diagram of the architecture of the solution
3.3 Example of mapping for a non-linear two-dimensional distortion pattern
3.4 Development cycle of a milestone
3.5 Possible physical setup of the hardware
4.1 Domain model overview
4.2 High-level architecture overview of the Framework
4.3 Objects – Domain model overview
4.4 Redirection – Domain model overview
4.5 Object Model – Domain model overview
4.6 Hand Model – Domain model overview
4.7 Object Tracking – Domain model overview
4.8 Hand Tracking – Domain model overview
5.1 Hardware setup overview
5.2 Scene setup overview
5.3 Overview of the solution implementation and integration with Unity 3D
5.4 RedirectionManager – Implementation overview
5.5 Redirection implementation overview
5.6 Object Tracking implementation overview
5.7 OTC calibration – View of the markers windows
5.8 Hand Tracking implementation overview
5.9 Hand interacting with the sphere in a pinch pose
5.10 Test System – Domain model overview
5.11 Log Data – Domain model overview
6.1 User profile distribution
6.2 Taxonomy of touch and move interaction techniques
6.3 Insights on average task execution time per task (as in default distortion level sequence)
6.4 Insights on average task execution accuracy per task
6.5 Insights on average task execution accuracy per user
6.6 Insights on how distortion levels affect accuracy
6.7 Insights on users' detection of manipulation
6.8 Insights on how users felt using the system
6.9 Insights on how calibration quality affects accuracy

List of Tables

3.1 List of features
5.1 List of features implemented in the functional prototype
6.1 Sequences of distortion levels used in testing sessions
6.2 Insights on Simulator Sickness scores of the system
6.3 Insights on users' distortion classification during the second phase
6.4 Frequency of subjective comments made by users
C.1 CSV log file template

Abbreviations

AR       Augmented Reality
FEUP     Faculdade de Engenharia da Universidade do Porto
HCI      Human-Computer Interaction
HTC      Hand Tracking Controller
HTD      Hand Tracking Device
HMD      Head-Mounted Display
MASSIVE  Multimodal Acknowledgeable multiSenSorial Immersive Virtual Environment
OTC      Object Tracking Controller
OTD      Object Tracking Device
SDK      Software Development Kit
SSQ      Simulator Sickness Questionnaire
UCD      User-Centered Design
VE       Virtual Environment
VR       Virtual Reality

Chapter 1

Introduction

Virtual Reality (VR) has been an active area of research in Computer Graphics and Systems at least since the publication of Sutherland's paper [Sut65] in 1965. This paper envisioned a VR system that provided immersion and interaction to all human senses [ZD09].

Computers and everyday technology have evolved at a steep rate in the last decades, allowing hardware and software to reach standards that made the appearance of the first VR applications possible. Alongside the evolution of the technology, there has been a rise of interest in this subject, leading to significant investment in the research and development of devices that support VR experiences [ZD09].

Today we see VR applications in a variety of fields. All of those applications have a common concept: the virtual experience. Even though this virtual experience has evolved significantly in the recent past, there is still a need for more immersive and interactive systems.

The existing technologies in the VR field are relatively well developed when it comes to devices that support the visual and auditory senses. However, devices that provide feedback to the other relevant senses – taste, smell and touch – are in their infancy. A VR experience with a good sense of immersion is one in which the supporting system provides feedback to the majority of the human senses.

Adding support for haptic feedback can enhance the sense of immersion in a Virtual Environment (VE) [Ins01]. In fact, without haptic feedback it is difficult to interact precisely with objects, as the user cannot feel the object or the movement restrictions that it imposes [MBJS97].

However, as virtual objects do not exist in the real world, they do not naturally provide haptic feedback. In the last decades, there have been attempts to solve this problem based on devices that recreate the sense of touch or restrict the user's movement – active haptics. Most of these devices are VR gloves or tables that provide haptic feedback. However, most of these solutions end up being somewhat limited, expensive or complex [SB97]. A different approach to this problem is to use real objects to provide haptic feedback to the users – passive haptics. Passive haptics can increase the user's sense of presence as well as his spatial memory of the VE [Ins01].

Thus, redirecting the user to real objects that are mapped to virtual ones, in such a way that when the user touches a virtual object he simultaneously touches the equivalent real one, may be a solution to transparently provide reliable haptic feedback.
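To make the idea concrete, the following sketch shows one minimal way such a virtual-to-real mapping could be represented. It is an illustrative sketch only: the RealObject, VirtualObject and DirectHapticsRegistry names, and the idea of keying the mapping by object identifier, are assumptions made here for illustration, not the data model actually used in this work (which is specified in Chapters 3 and 4).

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

# Hypothetical sketch: pair each virtual object with the real prop that
# will deliver its haptic feedback. All names here are illustrative.

@dataclass
class RealObject:
    object_id: str
    position: Tuple[float, float, float]   # tracked pose in room coordinates

@dataclass
class VirtualObject:
    object_id: str
    position: Tuple[float, float, float]   # pose in the virtual environment

class DirectHapticsRegistry:
    """Maps virtual objects to the real props that stand in for them."""

    def __init__(self) -> None:
        self._real_for_virtual: Dict[str, RealObject] = {}

    def bind(self, virtual: VirtualObject, real: RealObject) -> None:
        # The mapping need not be 1-to-1: a single real sphere, for
        # instance, may serve as the haptic proxy of several virtual ones.
        self._real_for_virtual[virtual.object_id] = real

    def haptic_proxy(self, virtual_id: str) -> Optional[RealObject]:
        """Real object the user must be redirected towards."""
        return self._real_for_virtual.get(virtual_id)
```

Because the real prop and its virtual counterpart generally sit at different positions, the difference between the two poses is exactly what a redirection technique (Section 2.3) must hide from the user.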

Other techniques, like Redirected Walking [RKW01] and Redirected Touching [Koh10], can also be implemented, adding the possibility of constructing large VEs when the real space available is relatively small. Therefore, it can be a great advantage in a variety of fields that make use of VR applications.

Letting users interact with real objects directly with their hands – direct haptics [Ker09] – to provide haptic feedback in VR experiences has the potential to make these experiences even more immersive and to increase the users' sense of presence during a VR experience. In addition, as this is a recent field of research, any advancement made in this dissertation will be a good contribution to the area.

In this dissertation, we develop a solution to support direct haptic feedback in VEs that, once finished and validated, will be integrated in the MASSIVE (Multimodal Acknowledgeable multiSenSorial Immersive Virtual Environment) project. The MASSIVE project aims to find a way of integrating all human senses in VEs, making immersive virtual experiences as perceptually equivalent to real ones as possible.

In Section 1.1 the problems addressed are identified. The goals to be achieved by the work done in this dissertation are listed and discussed in Section 1.2. Finally, Section 1.3 presents the structure of this document.

1.1 Problem Definition

This dissertation addresses two problems:

• The lack of a robust solution that provides the user of a VR system with the possibility to directly feel real objects when he touches them in a VE, in a transparent way, without losing the sense of immersion;

• The lack of an alternative to the haptic interfaces that rely on devices like VR gloves or surfaces that resort to techniques that recreate the sense of touch. Although these devices are under constant research and development, they face significant limitations.

1.2 Goals and Methodology

This dissertation aims to study, develop, evaluate and integrate a solution in the MASSIVE project that answers the problems presented in the previous section.

In order to address the problems identified above, the solution should:

• Take advantage of the dominance of the user's visual cues over kinesthetic cues to bring him closer to real objects without his awareness;

• Provide haptic feedback from real objects without making the user lose the sense of immersion.

Nowadays, most VR applications that integrate haptic feedback accomplish it by using haptic devices and interfaces based on actuators that attempt to recreate the feeling of touch when the user touches virtual objects [SCB04]. These devices are usually VR gloves or tabletops. Our approach is to provide the haptic feedback directly from real objects that do not need to be mapped in a 1-to-1 relation between the real and the virtual environments. This way, the user will need to be redirected to the object that he intends to touch in the VE. Thus, the solution should be able to combine different techniques presented in the next chapter in order to fulfil the aforementioned goals.

In order to achieve these goals, the following steps were carried out:

1. An overview of the VR field, to gain insight on how this field has evolved and how it might change, focusing on aspects relevant to this dissertation.

2. An overview of the haptic interfaces field, focusing on those useful in VEs, addressing their qualities and/or limitations.

3. A study of interaction and exploration techniques in VEs, which was used as inspiration to design the solution.

4. A study of development and evaluation guidelines for VR applications, which was used as a basis for conceiving the solution.

5. A solution specification, in terms of functionality and approach.

6. An architecture design to support the implementation of the solution, specifying a framework for the integration with the simulation engine.

7. The implementation of a functional prototype of the specified solution, using a milestone-based approach for development.

8. The execution of user testing sessions, in order to validate the solution and gather insights for future development.

1.3 Document Structure

This document has a total of seven chapters, this being the introduction. Chapter 2 describes the state of the art relevant to this dissertation.

In Chapter 3, the proposed solution is specified and conceptualized, in terms of functionality and the approach to implement it. In Chapter 4, the architecture created to support the specified solution is detailed. Chapter 5 presents important implementation details of the solution and describes the functional prototype developed.

In Chapter 6, the user tests used to validate the solution are presented, by describing the methodology followed and by analyzing and discussing their results.

Finally, Chapter 7 presents the conclusions, the contributions of this dissertation and perspectives for future work.

Chapter 2

Review on Haptics in Virtual Reality

In this chapter, the state of the art research that is both useful and applicable to this dissertation is described.

The Virtual Reality area is introduced in Section 2.1, which presents a historical perspective on its developments, defines it and documents the current state of the technology. Section 2.2 proceeds with the haptic interfaces field, presenting the current solutions for delivering haptic feedback and the main limitations of those solutions. These two sections establish the background and inspiration for the work done in this dissertation.

Section 2.3 introduces interaction and exploration in VR and presents an overview of two redirection techniques for VEs. Finally, Section 2.4 presents some key principles and guidelines for designing and evaluating VR applications, and introduces the User-Centered Design and Testbed Evaluation methodologies.

2.1 Virtual Reality

In 1989, Jaron Lanier formally presented the term “virtual reality”, generally used to describe computer simulation applications where people can interact, in a multi-sensorial immersive experience, with a synthesized three-dimensional VE [Isd98, ZD09]. These VEs are responsive and interactive in real time in such a way that the stimulating experience immerses the user in them [BM+07, BJ99, SVS05].

However, developments in VR had started much earlier. Some of the main achievements in the VR field are [SC02, ZD09]:

• In 1929, Edward Link developed a simple mechanical flight simulator capable of making people feel like they were in real flight scenarios [Pag00];

• In 1962, Morton Heilig patented a design for a head-mounted video device with vibration, sound, smell and wind feedback called Sensorama [ao62];

• In 1965, Sutherland presented a first VR system, which featured multi-sensorial feedback, immersion and interaction [Sut65];

• Beginning in 1966, Sutherland and his colleagues at Harvard created the first Head-Mounted Display (HMD) and later added feedback devices to the system, which simulated force and tactile sensations [Sut68];

• In 1967, the University of North Carolina began research and development of force feedback devices that let users feel computer-simulated forces [JOBK90];

• In 1987, published papers about interfaces in VR attracted great attention [Fol87, ZLB+87];

• In 1992, the Sense8 Company developed the “WTK” library, which provides high-level abstractions for the rapid development of virtual world applications and the integration of VR technologies [ZD09];

• In 1994, Burdea and Coiffet published a book about VR in which they summarized the basic characteristics of VR as the “3 I's”: imagination, interaction and immersion [BC03a].

Later in the nineties, the first applications of Augmented Reality (AR) appeared. Further research and development produced more and more devices supporting experiences in VEs. Today, 3D graphics have become ubiquitous, the hardware has improved greatly and the development of VR applications of relatively high quality has become possible.

2.1.1 Definition

There are different interpretations and understandings of what Virtual Reality actually is. Alongside the evolution of the technology, the definition of VR has changed from a technology-based one to one that focuses on the sensations experienced by the users [ZD09].

Definitions based on technology tend to describe VR as a set of technological components. These components have the capability to create a VE that people can interact with naturally [ZD09]. Typically, they comprise output tools (visual, aural and haptic); input tools (trackers, gloves or mice); a graphic rendering system; and database construction and virtual object modelling software [BJ99, BC03b]. On the other hand, definitions based on immersion tend to describe VR as the sense of being in an environment [SBL+95]. Thus, VR is described in terms of the sense of presence induced in the user that experiences the virtual world.

Sherman and Craig define four key elements needed in a VR experience: virtual world, immersion, sensory feedback and interactivity [SC02]. In the same book, VR is defined as

“a medium composed of interactive computer simulations that sense the participant's position and actions and replace or augment the feedback to one or more senses, giving the feeling of being mentally immersed or present in the simulation (a virtual world)” [SC02].

2.1.2 Reality versus Virtuality

There is a wide variety of systems that are used for different purposes, such as fully immersive VR, desktop VR, CAVE (Cave Automatic Virtual Environment), telepresence and AR [FAB09, Isd98]. These systems create environments and experiences that have different classifications in the Reality-Virtuality Continuum shown in Figure 2.1.

Figure 2.1: Definition of Mixed Reality within the context of the Reality-Virtuality Continuum [MC99]

With reality defined as an environment where only real objects exist and, at the opposite end, virtuality as a synthesized environment where only virtual objects exist, between the two lies mixed reality [MC99]. Mixed Reality comprises the worlds where virtual objects are incorporated into a real environment (Augmented Reality) or real-world objects are included in a VE (Augmented Virtuality); close to the center of the continuum, these worlds are sometimes difficult to classify.

In this line of thought, the work done in this dissertation fits the Augmented Virtuality classification. However, some of the work developed and knowledge acquired can also be applied to Augmented Reality applications.

2.1.3 Current State

The steep evolution of hardware and software capabilities caused a rise of interest in the research and development of VR solutions [ZD09]. In fact, in the last decades virtual and AR systems have evolved so much that, nowadays, we can see applications in a wide range of areas such as aeronautics, military, education, medicine and rehabilitation, entertainment and engineering, among others [ABB+01, A+97, FAB09].

Despite this, VR still has a long way to go, as it faces limitations and there are areas where research is only beginning. Devices that provide feedback to the visual and aural senses are relatively well developed in comparison with those for the remaining three senses (haptic, olfactory and gustatory), which are in their infancy; only now has there been a rise in research on solutions for those senses.

2.1.4 Summary

There have been great developments related to VR in the last decades, not only in the technology behind it, but also in its definition.

Despite the evolution of software and hardware and the continuous and increasing research and development of technologies and devices that support the creation of VEs, relatively well-developed solutions exist only for the visual and auditory senses. The remaining three senses are in an embryonic phase of exploration.

The next section presents an overview of haptic interaction within VR applications.

2.2 Haptic Human-Computer Interaction

Human-computer interaction (HCI) is a field that draws on a wide range of disciplines: not only computer science, but also psychology, sociology and anthropology. The definition provided by the ACM SIGCHI Curricula for HCI is [HBC+92]:

“Human-computer interaction is a discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them”

The study of the interaction between humans and computers has gained more relevance in recent years. The wealth of new technological developments has encouraged different ways of thinking about interaction design [PSR15].

The limitations of common 2D input devices, such as the mouse, pose major challenges when working in most desktop 3D applications. For this reason, more research has gone into the design and analysis of devices for providing input in 3D applications. In particular, input devices for two-handed interaction in immersive VEs have been extensively researched, leading to emerging technologies that enable haptic interaction and feedback in those VEs [Bow99].

2.2.1 Haptic Interaction

The word haptic derives from the Greek “haptikos”, which means “to touch or perceive”, and refers to the tactile and kinesthetic (or proprioceptive) senses. In computing, haptics refers to the scientific field that studies the interaction between a user and a computer through the tactile sense [BBB96].

Haptic interaction provides an additional communication channel between the user and the computer that can be exploited to make use of the user's haptic capabilities such as touch, force, vibration and movement [Cho06, Sri95]. Moreover, this channel can be bidirectional: beyond the information that the user sends to the computer, the computer can also provide haptic information about the interaction. This is what we call force feedback, and it can be provided by a haptic interface or device [PV09, SCB04]. Haptic interaction can be divided into two different classes of sensory information that work together to provide the means to perceive haptic sensations [HACH+04, Sri95]:

• Tactile information, referring to the sense of contact with the object: this tactile feedback allows the user to feel the various properties of the object, such as rugosity, edges, temperature or slippage;

• Kinesthetic information, referring to the sense of position and motion of the limbs along with the associated forces: in haptic interfaces this information is reproduced by force feedback, which allows the user to perceive the weight of grasped objects, inertia and motion constraints.

The inclusion of haptic feedback in VEs provides a way of giving the user the sensation of touching real objects when in fact he is interacting with virtual objects, allowing a more realistic interaction between the user and the VE [RDLT, Sri95].

2.2.2 Haptic Interfaces

Haptic interfaces are devices that enable manual interaction with VEs. The most primitive haptic interfaces are computer keyboards, mice and trackballs, which only serve the purpose of sending commands to the computer [SB97, Sri95]. However, more complex and richer haptic devices have been developed in order to address the need for a better use of human haptic capabilities in immersive VEs [HACH+04, Bow99].

These devices intend to mimic, by means of different mechanisms, the tactile and force feedback that exists in the real world. Most haptic interfaces use electrical actuators, although some devices use hydraulic or pneumatic interfaces. Force feedback interfaces need support for resisting or constraining user actions, which is most often provided by desks or, in the case of gloves, by parts of the body [Bur00, HACH+04].

Alongside, haptic rendering software was developed to support the development of VEs with haptic feedback. One example of this kind of software is the OpenHaptics toolkit [IHZ05]. Most VR systems that integrate haptic feedback have an architecture like the one presented in Figure 2.2 [IHZ05].

Figure 2.2: Basic high-level architecture for a virtual reality application incorporating visual, auditory and haptic feedback [SCB04]
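A defining trait of this class of architecture is that the haptic rendering loop runs at a much higher rate than the visual one: force feedback is commonly rendered at around 1 kHz, against tens of hertz for graphics. The threaded sketch below illustrates that two-rate split; the exact rates, names and structure are generic assumptions about such systems, not details taken from [SCB04] or [IHZ05].

```python
import threading
import time

# Generic two-rate loop typical of VR systems with haptics: a fast haptic
# thread streams forces at ~1 kHz while the simulation/graphics loop
# updates the scene at ~60 Hz. Rates and names are illustrative only.

class SharedScene:
    def __init__(self) -> None:
        self.lock = threading.Lock()
        self.contact_force = (0.0, 0.0, 0.0)  # force to send to the device

scene = SharedScene()
running = True

def haptic_loop() -> None:            # ~1000 Hz
    while running:
        with scene.lock:
            force = scene.contact_force
        # here: send `force` to the haptic device driver
        time.sleep(0.001)

def graphics_loop() -> None:          # ~60 Hz
    while running:
        with scene.lock:
            # here: collision detection / simulation updates the force
            scene.contact_force = (0.0, 0.0, 0.0)
        # here: render the visual frame
        time.sleep(1 / 60)

threading.Thread(target=haptic_loop, daemon=True).start()
threading.Thread(target=graphics_loop, daemon=True).start()
time.sleep(0.1)   # let both loops run briefly in this toy example
running = False
```

The split exists because stable force feedback requires far lower latency than visual rendering; sharing state through a lock (or lock-free buffers in real systems) keeps the two loops consistent.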

Many devices and concepts have been presented in the last decades. Joysticks are simple devices through which one can feel force feedback. One of the most iconic devices is the PHANTOM haptic interface (Figure 2.3, right), which combines tactile and force feedback. Several VR gloves have also emerged, such as the DataGlove by VPL, which measures finger bending, or Teleact, a glove that provides pressure and temperature feedback. The CyberGrasp glove (Figure 2.3, left) integrates force feedback by restricting finger motion when grasping virtual objects. Tactile surfaces and tabletops have also been researched, as well as interfaces for the generation of 3D surfaces [BKHAK04, Bur00, HACH+04, MS94].

Figure 2.3: The CyberGrasp glove to the left and the Phantom device to the right [Bur00, HACH+04]

Research and development of haptic interfaces has been increasingly applied in VR and teleoperation applications and spans a wide range of fields, both commercial and research oriented. Some of these fields are telerobotics and teleoperation, medicine and rehabilitation, education, training, military, entertainment, industrial manufacturing and scientific research, among others [Bur00, HACH+04, VPDL12].

2.2.3 Limitations of Haptic Interfaces

In spite of the progress achieved so far, the integration of haptic feedback into VEs faces several challenges and limitations. Further improvements in range, resolution and frequency bandwidth, both in terms of forces and displacements, are needed to match the performance of human users [Bur00, SB97, Sri95]. In addition, haptic devices remain the most difficult and costly to build, which slows research and development in this field [KASA08].

Current haptic interfaces can only provide stimuli that approximate our interactions with real environments. Although this can be enough to provide a relatively realistic feeling, it is only achieved because multiple cues are kept consistent in the VE. In other words, the limitations of the human visual and proprioceptive perceptual apparatus, such as the domination of visual cues over proprioceptive cues, are exploited in order to deliver that realistic feel [Sri95].

At the same time, our understanding of how human perception and performance work is quite limited, and better knowledge in this field would benefit VR applications. As such, approaches where current VEs help research on human perception become necessary; in turn, knowledge of human perception helps the design of next-generation haptic systems [SB97].

2.2.4 Summary

There are already devices that exploit the user's haptic sense as a bidirectional channel. These devices usually resort to actuators capable of recreating touch sensations or of restricting movement. However, the current understanding of human haptic interaction is quite limited. Hence, the VEs and haptic devices available nowadays present several limitations in terms of haptic feedback.

Section 2.3 presents techniques that give insight into alternatives to the devices presented in this section.

2.3 Exploration and Interaction in Virtual Environments

Locomotion in immersive VEs is often performed with devices such as joysticks, wands or other controllers. However, such setups present an unnatural way of navigating the VE and often degrade the virtual experience [BIPS12]. A crucial problem in VEs is the development of natural ways of exploring these spaces. Studies confirm that the use of the user's body, with real proprioceptive sensations, is correlated with a significant increase in the sense of presence felt during the experience [SMM98, SUS94, UAW+99].

Walking inside a VE enhances the sense of being present, naturalness and task performance [CVC+12]. However, in most situations the freedom of walking is restricted by physical barriers, as the space available can be small or inappropriate. Therefore, problems arise with the setup of large VEs and, in addition, concerns related to user safety [CVC+12].

Research on locomotion through VEs has followed two directions: the development of wide-area trackers, so that the user can really walk, and the development of body-active surrogates for walking, such as treadmills, bicycles, wheelchairs, roller skates, walking-in-place and redirection techniques [SMM98, SUS94, UAW+99].

Just like walking, research shows that using real objects to provide haptic feedback to the user can significantly enhance the sense of presence in a virtual experience [Ins01]. THE VOID Simulator is an example of a VR system that applies this knowledge and makes use of direct haptics for interaction in the VE [Jen15].

2.3.1 Redirected Walking

Studies show that visual cues often dominate over proprioceptive and vestibular cues when the senses disagree [Ber02, DB78]. Researchers also found that users tend to compensate for these inconsistencies with their body, making it possible to redirect the user's path in the real world even though it differs from the virtual path [BIPS12, RKW01].

Razzaque et al. demonstrated a technique called Redirected Walking, with which users could walk through very large VEs [RKW01]. Exploiting the limits of human perception of visual, proprioceptive and vestibular cues, the technique introduces a rotational distortion into the virtual scene that varies with the position, orientation, velocity and angular velocity in the real environment. The amount of distortion injected is calculated so as not to be detected by the user [RKW01]. An example of the results of this technique is presented in Figure 2.4.

Figure 2.4: Difference between the virtual path (above) and real path (below) of the user using Redirected Walking [RKW01]

Further research introduced distortion into translational and curvature movements to use the real space available more efficiently [FV04, NSE+12, SBRH08]. In addition, studies have analyzed user sensitivity to redirection and estimated detection thresholds. These studies also suggest that the user is more sensitive to rotational distortions when in motion [NSE+10, SBJ+10, SBK+08].

More generally applicable algorithms for redirecting the user throughout the VE while walking towards objects or evading obstacles have also been researched [KBMF05, SBK+08, SLF+12]. Furthermore, some studies integrated passive haptic feedback into VEs where redirected walking was applied. These studies explored the redirection of users to real objects that are modelled in the VE in two distinguishable ways: with waypoints and target locations that the user must reach, or by predicting the locations that the user wants to reach [KBMF05, SBK+08].
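As a rough illustration of how a rotational distortion can be injected, the sketch below scales the user's measured head rotation by a gain before applying it to the virtual camera, steering the virtual heading towards a desired drift. The gain bounds and names are illustrative assumptions; this is not the controller of [RKW01], and published detection-threshold estimates vary between studies.

```python
# Illustrative sketch of rotation-gain redirection (not the actual
# controller of Redirected Walking). Each frame, the user's real yaw
# change is slightly amplified or attenuated so that the virtual heading
# drifts away from the real one without the user noticing.

MAX_ROTATION_GAIN = 1.2   # assumed bounds, chosen to stay inside
MIN_ROTATION_GAIN = 0.8   # typical rotation-gain detection thresholds

def redirected_yaw(virtual_yaw: float,
                   real_yaw_delta: float,
                   desired_drift: float) -> float:
    """Advance the virtual heading given this frame's real head rotation.

    desired_drift is the extra rotation the system still wants to inject
    (e.g. to steer the user away from a physical wall).
    """
    if real_yaw_delta * desired_drift > 0.0:
        gain = MAX_ROTATION_GAIN   # rotation helps the drift: amplify
    else:
        gain = MIN_ROTATION_GAIN   # rotation opposes the drift: attenuate
    return virtual_yaw + gain * real_yaw_delta
```

Because the gain is only applied while the user is already rotating, the injected distortion is masked by the user's own motion, which is what keeps it below the perceptual thresholds discussed above.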

2.3.2 Redirected Touching

A single real object can be used to provide feedback for multiple virtual objects, since these might be replicas or might be in different locations [Koh09]. Moreover, research has shown that it is possible to deceive the user into perceiving that he is touching a virtual object that has small differences relative to the real one [Koh09, RV64].

Figure 2.5: Difference between the virtual finger location (corner) and the real location (in red) after warping space [Koh10]

Luv Kohli performed a study in which users would perceive small amounts of distortion modelled into a virtual table while touching a flat table in the real world. The study reported that, in a similar way, small distortions could be introduced in virtual objects relative to the real objects, making the latter a source of haptic feedback without the user perceiving it [Koh09].

Further research by the same author suggested that the virtual space can be warped in such a way that it is possible to remap the correspondences between real and virtual objects. In other words, the virtual space can be deformed in relation to the real space and, by keeping all the perceptual cues consistent, the user will perform the given tasks without perceiving the deformation or losing his sense of immersion [Koh13, Koh10]. An example of this technique is presented in Figure 2.5.

Results of the analysis of training, adaptation and task performance indicate that, after adaptation, users can perform tasks in a warped space as well as in an unwarped one [KWB12, KWB13].
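A minimal sketch of the underlying idea follows: the rendered hand is offset by an amount that grows as the real hand approaches the real prop, so that the virtual and real contacts coincide at the moment of touch. This linear blend is a deliberate simplification of Kohli's space warping (which deforms the whole virtual space rather than just the hand), and every name in it is an assumption made for illustration.

```python
import numpy as np

# Simplified redirected-touching warp: blend an offset into the rendered
# hand position so that touching the virtual target and touching the real
# prop happen at the same instant. Illustrative only.

def warped_hand_position(real_hand: np.ndarray,
                         real_target: np.ndarray,
                         virtual_target: np.ndarray,
                         influence_radius: float = 0.5) -> np.ndarray:
    """Hand position to render, given positions in a shared coordinate frame."""
    offset = virtual_target - real_target
    distance = float(np.linalg.norm(real_target - real_hand))
    # Blend factor: 0 outside the influence radius, 1 at contact.
    alpha = max(0.0, 1.0 - distance / influence_radius)
    return real_hand + alpha * offset

# Example: the real prop sits 10 cm to the right of its virtual twin.
rendered = warped_hand_position(real_hand=np.array([0.10, 0.0, 0.30]),
                                real_target=np.array([0.10, 0.0, 0.0]),
                                virtual_target=np.array([0.0, 0.0, 0.0]))
```

Keeping the offset zero far from the prop and ramping it in gradually is what keeps the visual and proprioceptive cues consistent enough for the manipulation to go unnoticed.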

    2.3.3 Summary14

    The literature suggests that using the body actively as well as implementing passive haptics can

    enhance significantly the experience of the users in VEs.16

    13

  • Review on Haptics in Virtual Reality

    Application of these principles proved successful and the techniques presented exploit limi-

    tations in perceptual apparatus, enabling benefits towards the development of VEs when the re- 2

    sources do not meet the necessary requirements. In addition, haptic feedback delivered from real

    objects without users losing the sense of presence was achieved with positive results, which sug- 4

    gest that is possible to develop an interaction framework based on direct haptics.

    Section 2.4 presents methodologies, principles and guidelines that help the design and devel- 6

    opment of interaction interfaces in VR applications.

    2.4 Interaction Design in Virtual Reality 8

    Compared to conventional interfaces, VR interfaces present a richer and more complex envi-

    ronment [TJ01]. Therefore, designing of VR interfaces poses significant challenges as three- 10

    dimensional interaction is not well understood [HvDG94, TJ01].

    Most people are used to interacting with computers through keyboards and mice, devices that were not developed for immersive VEs and which present an easier way of interacting with applications than the ones used in VEs [BJH+01, Bow99]. Therefore, extreme care must be taken when designing interaction techniques and user interfaces for VEs in order to make them useful and usable [BJH+01, Bow99]. In the same way, interaction evaluation in immersive VEs poses a greater challenge than the usability studies performed on the common user interfaces of 2D and 3D applications [Bow99, BIPS12, Bow97].

    As such, following established principles, guidelines and best practices is particularly important during the design, implementation and testing phases of the development of a VR system [Bow99]. The next sections present specific established guidelines and methodologies in the context of the development of novel interaction techniques in VR.

    2.4.1 Interaction and Manipulation Design Guidelines

    Publishing sets of guidelines for user interfaces, interaction techniques and similar contexts is a common tradition amongst the HCI community. These guidelines help designers create systems with interactions that are usable and perform well. However, most guidelines and principles for VE interaction tend to be too generic or adapted from 2D HCI guidelines, which does not ensure that their application will result in a well-performing system [Bow99].

    Bowman introduced a specific and practical set of principles and guidelines, validated in the laboratory and in deployed systems, to help the design of interaction in immersive VEs. The set of guidelines encompassed interaction, travel, selection and manipulation techniques. The principles and guidelines relevant to the scope of this dissertation are [Bow99, BH97]:

    • Do not assume that techniques based on a natural, real-world metaphor will be the most intuitive or that they will have the best performance, as his study confirmed that techniques closer to natural mapping often exhibited serious usability problems.


    • Provide redundant interaction techniques for a single task, as some users may comprehend complex techniques easily and intuitively while others may never become fully comfortable with them.

    • Use ray-casting techniques if speed of remote selection is a requirement, due to their efficiency compared to arm-extension techniques.

    • Ensure that the chosen selection technique integrates well with the manipulation technique to be used, so that a seamless transition between the selection and manipulation techniques is achieved.

    • If possible, design the environment to maximize the perceived size of objects, unless the application requires a precise replication of the real-world environment.

    • Reduce the number of degrees of freedom to be manipulated if the application allows it, and provide general or application-specific constraints or manipulation aids, in order to reduce the complexity of the interaction from the user's point of view.

    • Allow direct manipulation with the virtual hand instead of using a tool, as studies confirmed that it leads to greater efficiency in performance and user satisfaction.

    • Avoid repeated, frequent scaling of the user or environment during manipulation, as it can cause discomfort in users.

    • Use indirect depth manipulation for increased efficiency and accuracy, such as allowing the user to look closer when objects are far from him, which leads to more efficient performance.

    2.4.2 User-Centered Design

    User-Centered Design (UCD) is a philosophy that puts its emphasis on people rather than technology [ND86]. It aims to design systems that allow users to achieve their goals in a particular context of use [K+11]. The ISO 9241-210 standard (“Human-centred Design for Interactive Systems”) describes six principles that characterize UCD [K+11]:

    1. The design of interactive systems is based upon an explicit understanding of users, tasks and environments.

    2. Users are involved throughout the design and development.

    3. Design is driven and refined by user-centered evaluation.

    4. The process is iterative.

    5. The design addresses the whole user experience.

    6. The design team should be multidisciplinary.

    As the process is iterative, evaluation periods with users are needed alongside the process of design and development of a solution.


    2.4.3 Testbed Evaluation for Interaction Techniques

    Testbed evaluation is a powerful tool for the assessment of interaction in VEs. It aims to find generic performance characteristics of interaction techniques in VEs, and provides a methodology for the evaluation of user interaction based upon a set of tasks and environments [BJH+01, Bow97]. The framework for this methodology is described in Figure 2.6.

    Figure 2.6: Evaluation approach in testbed evaluation [BJH+01]

    The first step is to create a taxonomy of interaction techniques for the tasks and identify the subtasks and components that accomplish them. This taxonomy and categorization also helps guide the design process of the system, as it allows a deeper understanding of the requirements. We then identify the useful performance metrics that must be taken into account, such as accuracy or completion time of the task, as well as usability metrics such as simplicity of interaction, the sense of presence felt or the ease of learning. Outside factors must also be considered, as there can exist constraints related to the system, tasks, environment and/or user [BGH02, BJH+01, Bow97].

    This evaluation method produces a set of both objective and subjective results for the performance metrics and tasks specified, as well as a set of guidelines to guide future research and development.
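    As a simple illustration, one trial of such an evaluation could be recorded in a structure like the following. This is a hypothetical sketch; the field names and rating scales are our own assumptions, not prescribed by the methodology:

        from dataclasses import dataclass

        @dataclass
        class TrialRecord:
            technique: str            # interaction technique under evaluation
            task: str                 # e.g. "select", "grab", "place"
            completion_time_s: float  # objective performance metric
            accuracy: float           # task-specific accuracy measure
            presence_rating: int      # subjective metric, e.g. 1-7 Likert scale
            ease_of_learning: int     # subjective metric, e.g. 1-7 Likert scale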

    2.4.4 Summary

    Following established methodologies and guidelines is particularly important when designing novel interaction interfaces in VR systems.


    The UCD philosophy enables design and development in a fast and flexible iterative process, which allows the production of a system that is closer to what the user might expect. Furthermore, testbed evaluation enables the generalization of the results obtained, which might be useful for the development of next-generation VEs.

    These guidelines and methodologies will be followed in the conception and implementation phases of the solution specified in Chapter 3.


    Chapter 3

    Solution Specification – Direct Haptics Framework

    This chapter presents an overview of the solution proposed in this dissertation to the problem identified in Section 1.1, supported by the research on the state of the art presented in Chapter 2. The title of the chapter is a reference to the solution proposed here, which is a framework that makes use of direct haptics in order to provide haptic feedback to users in a VE. An overview of the solution is presented in Section 3.1, as well as a more detailed overview of its main components.

    The approach adopted for the development of the presented solution is described in Section 3.2, detailing the methodologies and the work plan followed in this dissertation.

    Finally, the functionality of the solution is presented in the form of features, detailed in Section 3.3. In Section 3.4, the hardware requirements are briefly described and the setup is illustrated.

    3.1 Solution Overview

    Aiming to tackle the problems identified in Section 1.1, the following question is stated, serving as a guide for the development of the proposed solution:

        Is it possible to include direct haptics and redirection techniques in a VE to transparently provide haptic feedback to the user without him perceiving that he is being manipulated?

    Considering that passive haptics can significantly enhance virtual experiences, we propose that, rather than resorting to devices that try to recreate or simulate the sense of touch, direct haptics should be used; in other words, the haptic feedback should be delivered to the user from real objects when he touches the equivalent ones in the VE with his hands [Ins01]. This behavior is illustrated in Figure 3.1. As can be seen on the left, upon touching the virtual object the user


    Figure 3.1: A user receiving haptic feedback directly from a sphere (right) coherent with what he sees in the VE (left)

    receives haptic feedback directly from the real one on the right, which the system ensures is in the expected position so that the feedback is accurate and reliable.

    Considering a virtual environment that is a 1-to-1 mapping of a real environment – in which a user can touch the real objects – upon touching, the user will receive haptic feedback from the real object, which should be in the same position as the virtual one. This should suffice to address the need of transparently providing haptic feedback to the user from real objects.

    However, there are situations where it is not possible to have a 1-to-1 mapping between real objects and virtual objects, because the available real space does not meet the necessary requirements. Hence, once in the virtual environment, what the user sees through the head-mounted display is not necessarily the same as what he has in front of him in the real world. In this situation, a need arises for reorientation of the user relative to the real objects that he intends to touch.

    In Chapter 2, we state that it is possible to exploit the dominance of visual cues over proprioceptive cues to redirect the real path of the user when walking through the environment [BIPS12, RKW01]. In the same way, it should be possible to redirect the user to objects that are in relatively different positions between the real and virtual environments, without him perceiving that he (or the object) is being manipulated [Koh10].

    As such, it should be possible to warp the virtual space in order to place the virtual objects with a greater spatial distribution than that of the real objects [Koh13, Koh10]. This way, the user is led to perceive a greater or smaller freedom of movement in the virtual space, even if this is not the case in the real space.

    3.1.1 Architecture Overview

    As previously stated, the solution integrates direct haptics, meaning that it does not make use of a haptic device capable of haptic rendering to provide haptic feedback to the users. Thus, in contrast to the architecture shown in Subsection 2.2.2, our solution eliminates the need for a Haptic Device and the Haptic Rendering Module, and introduces Hand Tracking, Object Tracking and Redirection Modules, as well as a need for Hand Tracking and Object Tracking


    Devices. For a better understanding of how these fit together in that architecture, an updated diagram of the architecture is presented in Figure 3.2.

    Figure 3.2: Updated diagram of the architecture of the solution

    A brief description of the purpose of the Redirection, Hand Tracking and Object Tracking Modules is presented in Sections 3.1.2, 3.1.3 and 3.1.4, respectively. The remaining modules are also present in the solution, but as they are not the focus of this dissertation, they will not be discussed in detail. In addition, a more detailed architecture specification is presented in Chapter 4.

    3.1.2 Redirection

    The purpose of the Redirection Module is to change the positions of the user's hands and of the virtual objects according to a distortion pattern defined for the VE. By distortion pattern, we mean the mapping between the real-world space and the virtual-world space. Figure 3.3 illustrates an example of a mapping according to a non-linear two-dimensional distortion pattern, which can be seen as the correspondence between two polygonal meshes¹.

    Figure 3.3: Example of mapping for a non-linear two-dimensional distortion pattern

    ¹A polygonal mesh is a collection of vertices, edges and faces that defines a shape, used to simplify the rendering of 3D models in computer graphics.


    The green mesh has no distortion, since it represents the real-world space and each 2D coordinate in the mesh corresponds to a 3D coordinate in the real world. In a similar way, every 2D coordinate on the red mesh maps to a 3D coordinate in the virtual world.

    The position of the user's hands or objects – input that comes from the Hand Tracking and Object Tracking Modules (see Sections 3.1.3 and 3.1.4) – is represented by the black line in the green mesh. The new position that the user will see in the VE should be the corresponding 2D coordinate on the red mesh, represented by the blue line.

    The previous explanation considers a 2D distortion pattern; however, it is also possible to have 1D and 3D distortion patterns. While a 1D distortion pattern could be, for example, a distortion applied only along the Z axis, a 3D distortion pattern would be a volumetric pattern.

    The output positions of this Module are the virtual offset positions that the user will see in the VE. It is important to note that only one pattern can exist at a time in the VE, as the redirection of the user's hands and of all objects has to be coherent. In other words, using the same pattern ensures that when the user approaches an object to grab it, his hand and the object will suffer the same distortion.
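    As an illustration of how such a mapping could be queried, the minimal Python sketch below performs a bilinear interpolation over the virtual mesh. The class and method names are our own assumptions, and the real mesh is assumed to be a regular grid with spacing cell_size:

        import numpy as np

        class DistortionPattern2D:
            # The real ("green") mesh is an implicit regular grid: vertex
            # (i, j) sits at (j * cell_size, i * cell_size). virtual_points
            # holds the displaced ("red") mesh as a (rows, cols, 2) array.
            def __init__(self, virtual_points, cell_size):
                self.virt = np.asarray(virtual_points, dtype=float)
                self.cell = float(cell_size)

            def redirect(self, p):
                # Locate the grid cell containing the real-space position p,
                # clamping to the mesh bounds.
                p = np.asarray(p, dtype=float)
                rows, cols = self.virt.shape[:2]
                col = int(np.clip(p[0] // self.cell, 0, cols - 2))
                row = int(np.clip(p[1] // self.cell, 0, rows - 2))
                u = p[0] / self.cell - col  # fractional coords inside the cell
                v = p[1] / self.cell - row
                # Bilinear blend of the four virtual-mesh corners of the cell.
                c00, c10 = self.virt[row, col], self.virt[row, col + 1]
                c01, c11 = self.virt[row + 1, col], self.virt[row + 1, col + 1]
                return ((1 - v) * ((1 - u) * c00 + u * c10)
                        + v * ((1 - u) * c01 + u * c11))

            def offset(self, p):
                # The Module's output: the virtual offset relative to the
                # real position.
                return self.redirect(p) - np.asarray(p, dtype=float)

    If the virtual mesh equals the real grid, the offset is zero everywhere; displacing virtual vertices bends the positions reported for hands and objects in exactly the same way, which keeps the redirection coherent.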

    3.1.3 Hand Tracking

    The purpose of the Hand Tracking Module is to ensure an accurate representation of the user's hands inside the VE. As such, it is responsible for accurately detecting and tracking the user's hands and the position, rotation and scale of their articulations. For a correct representation of the virtual position, this Module should interact with the Redirection Module in order to get the offset position according to the distortion pattern in the scene (see Section 3.1.2).

    As the haptic feedback is delivered directly from real objects – upon contact with them in the VE – there is no need for the hand tracking module to have haptic rendering capabilities.

    3.1.4 Object Tracking

    The purpose of the Object Tracking Module, like that of the Hand Tracking Module, is to ensure an accurate representation of the real objects – those the user is able to interact with – inside the VE. As such, it should be capable of:

    • Recognizing the real environment and performing a calibration step, so that the position, rotation and scale of the objects in the real-world space are coherent with those displayed in the VE;

    • Detecting and tracking the real objects and providing their positions so that they can be represented in the VE. For a correct representation of the virtual position, this Module should interact with the Redirection Module in order to get the offset position according to the distortion pattern in the scene (see Section 3.1.2).


    3.2 Methodology

    For the implementation of the solution, a User-Centered Design approach was adopted, as described in Section 2.4.2. In order to rapidly implement a robust solution, a milestone-based approach was followed, where each milestone targeted the implementation of different features of the solution.

    Each milestone had a development cycle divided into four main stages, as shown in Figure 3.4.

    Figure 3.4: Development cycle of a milestone

    During the analysis stage, brief research and elicitation of the requirements for the features to implement were done. The design stage was then carried out, designing the architecture of the features and how they integrate with the work previously done, taking into account restrictions and limitations. During the implementation stage, the features were added to the system. During the evaluation stage, brief evaluations of the solution were performed by users (in early iterations, these users were developers or people familiar with the solution) to assess that all requirements were met and that there were no apparent design flaws. If major flaws were detected and changes in the design of the architecture were needed, a retrospective was done and the process restarted from the design stage.

    The work done in this dissertation can be grouped into seven major phases:

    • State of the Art Review, which resulted in Chapter 2 of this dissertation and set the foundations and principles on which this dissertation was developed;

    • Architecture Specification, in which the initial architecture of the system was specified. This architecture is presented in Chapter 4;

    • Milestone I, focused on implementing a base architecture, which integrated hand tracking and delivered a first functional prototype. Users were able to be in a VE, see their hands and interact with virtual objects;

    • Milestone II, focused on implementing the redirection system. The system became able to apply a redirection of the user's hands according to the redirection pattern defined in the system;

    • Milestone III, focused on implementing the object tracking system, which concluded the functional proof of concept: the system became able to track an object and coherently represent it in the VE according to the redirection pattern defined in the system. As such, users were able to grab the real object, seeing a coherent behavior in the VE;

    • Milestone IV, focused on the implementation of the test system, which enabled the validation of the functional proof of concept with users;


    • User Testing, in which the results of the four Milestones were validated with users, as described in Chapter 6.

    3.3 Functionality

    This section presents an overview of the features specified for the proposed solution. It is important to note that, given the scope of this dissertation and the assumption that this work is to be continued, only a minimum viable proof of concept was implemented in the functional prototype.

    Table 3.1: List of features

    Module           ID    Name
    Hand Tracking    F-01  Space Calibration
                     F-02  Hand Detection
                     F-03  Hand Tracking
                     F-04  Hand Representation
                     F-05  Detect Touch Interaction
                     F-06  Detect Pinch Interaction
                     F-07  Detect Grab Interaction
                     F-08  Modify Attributes
    Object Tracking  F-09  Space Calibration
                     F-10  Object Detection
                     F-11  Object Tracking
                     F-12  Object Representation
                     F-13  Support for Multiple Objects
                     F-14  Modify Attributes
    Redirection      F-15  Create 1D Pattern
                     F-16  Create 2D Pattern
                     F-17  Create 3D Pattern
                     F-18  Modify Pattern Attributes
                     F-19  Remove Pattern

    The features presented in this section focus on specifying a complete solution, which would be implemented as a fully-featured framework to be integrated into the MASSIVE project, given the appropriate resources and time. In addition, the specified features cover only the Hand Tracking, Object Tracking and Redirection Modules – as they are the focus of the differentiating aspect of the solution – and are listed in Table 3.1, being described in more detail in Appendix A.

    3.3.1 Functional Prototype

    The functional prototype developed in this dissertation served to test the hypothesis stated at the beginning of Section 3.1. This prototype is a first iteration of the solution and, for that reason, the implementation comprised only a subset of the specified features (see Section 3.3). In addition, a system to test and validate the prototype – which is not part of the solution – was implemented.


    Figure 3.5: Possible physical setup of the hardware

    The prototype developed in this dissertation was implemented addressing the basic functionality needed, without focusing on the details. This happened because there was a need to experiment with different functionality and make decisions based on early results; by not focusing on the details, we could implement the prototype iteratively and keep it adaptable to change, with room for further development and improvement. Nonetheless, the functional prototype had to be robust enough to give the user a natural experience, as expected in virtual reality applications. As such, the functional prototype created provides good foundations for further development.

    3.4 Hardware Requirements and Setup

    In order to implement this solution, in addition to the normal hardware of a VR system, the specified solution needs tracking devices for both the hands of the user and the objects. The solution does not require two distinct devices, as the same tracking device could be used for both. Although valid, that case will not be discussed in this dissertation. Figure 3.5 illustrates a possible physical setup of the hardware as used throughout the development and testing of the functional prototype.

    3.5 Summary

    An initial specification of the solution was proposed, supported by the state of the art research conducted in Chapter 2. The scope of this dissertation assumes continued research and development,


    and therefore this specification has a broader scope than the dissertation itself, which only focuses on developing a functional prototype that serves as a proof of concept.

    Three main modules were identified for this solution: the Hand Tracking, which enables the tracking of the user's hands; the Object Tracking, which enables the tracking of objects; and the Redirection, which enables the reorientation of both the user's hands and objects according to the defined distortion pattern. Based on UCD, an approach for developing this solution was described. Finally, the features and the hardware requirements and setup for the solution were described, in order to explain the relevant concepts of the solution.

    Chapter 4 details the architecture that supports the solution proposed in this chapter.


    Chapter 4

    Solution Architecture

    In this chapter, an initial detailed specification of the architecture of the solution is presented. This architecture was conceived following the guidelines detailed in Section 2.4.1.

    The architecture proposed here tries to capture the important base concepts of the solution at a high level of abstraction. By high level of abstraction, we mean that hardware and algorithmic choices, and their consequent constraints that may require changes or extensions to this architecture, will not be taken into account.

    Figure 4.1 illustrates the five main modules of the solution (Redirection, Hand Model, Object Model, Hand Tracking and Object Tracking) and their base class (Object):

    Figure 4.1: Domain model overview

    • Objects, detailed in Section 4.2, the base class that every module derives from;

    • Redirection, detailed in Section 4.3, the module responsible for managing and providing the distortion data in a VE;


    • Object Models, detailed in Section 4.4, which are the virtual representations of the real objects that the user can interact with;

    • Hand Models, detailed in Section 4.5, which are the virtual representations of the hands of the user;

    • Object Tracking, detailed in Section 4.6, which is the module responsible for detecting and tracking objects;

    • Hand Tracking, detailed in Section 4.7, which is the module responsible for detecting and tracking the user's hands.

    An overview of how these modules fit together in a VR system is presented in Section 4.1.

    4.1 Framework

    Figure 4.2 illustrates how the five main modules of the solution fit together in a working VR system.

    Figure 4.2: High-level architecture overview of the Framework

    Throughout the whole experience, the user interacts with and receives haptic feedback directly from real objects, while visual and audio feedback is computer generated.

    As can be seen, the Hand and Object Tracking Devices feed data to the Hand and Object Tracking Controllers, respectively. Both controllers parse the input and trigger changes in the objects that they detect and track. When the user interacts with an object in the real world, the virtual hand likewise interacts with the virtual object in the VE.


    The RedirectionManager holds the distortion pattern defined for a given VE and provides both HandModel and ObjectModel objects with the distortion offsets of their positions according to that pattern.

    Finally, the main modules of the framework are controlled by the Simulation Engine in order to adjust the necessary properties, so that the desired behavior of the system is achieved.
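    A minimal sketch of this data flow – one frame of the loop – could look as follows. All names here are hypothetical assumptions; sketches of the controller and model classes themselves are given in Section 4.2:

        def simulation_step(dt, hand_device, object_device,
                            hand_controller, object_controller, models):
            # Devices feed raw frames to their tracking controllers...
            hand_controller.on_sensor_frame(hand_device.read_frame())
            object_controller.on_sensor_frame(object_device.read_frame())
            # ...and every tracked model then applies its update behaviors,
            # which query the RedirectionManager and resolve collisions.
            for model in models:
                model.update(dt)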

    4.2 Objects

    Figure 4.3: Objects – Domain model overview

    Figure 4.3 shows a diagram of the Objects Domain and its relationships. The Object is the base class that every module derives from. Every Object has a Transform that holds information on the object's position, rotation and scale.

    A Model is an Object that can be seen in the VE, such as the user's hands or the objects that he can interact with. As such, a Model has a GraphicsModel, which holds the shape mesh of the object so that it can be rendered, and a PhysicsModel, which holds the collision mesh of the object so that collisions between Models can be detected. It can also exhibit UpdateBehaviors, which manage the changes of the object in the VE throughout an experience.

    A TrackingController is an Object that is responsible for controlling and managing how Models are tracked. For this, it can exhibit ProcessingBehaviors, which process the data that comes from the sensors and trigger changes in the objects. The TrackingController also holds references to the objects that it detects and tracks.
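    The following Python sketch summarizes these relationships. Method names such as update, apply and on_sensor_frame are our own assumptions, added only to show how the pieces could connect:

        from dataclasses import dataclass

        @dataclass
        class Transform:
            position: tuple = (0.0, 0.0, 0.0)  # (x, y, z)
            rotation: tuple = (0.0, 0.0, 0.0)  # Euler angles
            scale: tuple = (1.0, 1.0, 1.0)

        class Object:
            # Base class: every entity in the framework owns a Transform.
            def __init__(self):
                self.transform = Transform()

        class Model(Object):
            # An Object visible in the VE: a render mesh, a collision mesh
            # and a list of behaviors that update it every frame.
            def __init__(self, graphics_model=None, physics_model=None):
                super().__init__()
                self.graphics_model = graphics_model  # shape mesh (render)
                self.physics_model = physics_model    # collision mesh
                self.update_behaviors = []            # UpdateBehavior list

            def update(self, dt):
                for behavior in self.update_behaviors:
                    behavior.apply(self, dt)

        class TrackingController(Object):
            # Parses sensor data through ProcessingBehaviors and pushes the
            # resulting changes into the Models it detects and tracks.
            def __init__(self):
                super().__init__()
                self.processing_behaviors = []
                self.tracked_models = []

            def on_sensor_frame(self, frame):
                for behavior in self.processing_behaviors:
                    behavior.process(frame, self.tracked_models)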

    4.3 Redirection

    Figure 4.4 shows a diagram of the Redirection Domain and its relationships. The RedirectionManager is responsible for creating and managing the distortion pattern and mechanisms for a given VE, as well as providing distortion data to the ObjectModel and HandModel modules (see Sections 4.4 and 4.5).


    Figure 4.4: Redirection – Domain model overview

    The Pattern is the object that contains the specification and properties of a distortion pattern. There are three kinds of Pattern (1D Pattern, 2D Pattern and 3D Pattern), their difference being the number of dimensions of the pattern. The RedirectionManager, being an Object, also has a Transform that specifies the root position and transformations of the pattern in the VE.

    The distortion data given to the ObjectModel and HandModel modules should be the offset position – relative to their real positions – that the objects should be in, according to the distortion pattern.
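    In code, the manager could reduce to a thin wrapper around the single active pattern. This is a sketch under the same assumptions as the hypothetical DistortionPattern2D from Section 3.1.2:

        import numpy as np

        class RedirectionManager:
            def __init__(self, pattern):
                self.pattern = pattern  # exactly one Pattern active at a time

            def offset_for(self, real_position):
                # Offset = warped (virtual) position minus the tracked (real)
                # one; HandModel and ObjectModel add this offset to their
                # tracked positions.
                real_position = np.asarray(real_position, dtype=float)
                return self.pattern.redirect(real_position) - real_position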

    4.4 Object Models

    Figure 4.5: Object Model – Domain model overview

    Figure 4.5 shows a diagram of the Object Models Domain and its relationships. An ObjectModel is the virtual representation of a real object that the user can interact with.

    Deriving from Model, the ObjectModel has an ObjectUpdateBehavior that is responsible for updating the object's transform according to:

    • Data received from the ObjectTrackingController, namely the position, rotation and scale of the object in the VE;

    • The offset position given by the RedirectionManager, which is added to the position provided by the controller;

    • Collisions that may occur with other objects in the VE;

    • Interaction information, if a hand is interacting with the object (see Section 4.5).


    This behavior should also implement assumption mechanisms so that, when the tracking controller loses track of the object, updates to its Transform – based on those assumptions – are still executed, until the object is destroyed after a timeout.
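    A sketch of such an update behavior follows, including a very simple assumption mechanism: holding the last known pose until a timeout. The names, the hold strategy and the 2-second timeout are our own assumptions:

        import numpy as np

        class ObjectUpdateBehavior:
            def __init__(self, redirection_manager, timeout=2.0):
                self.redirection = redirection_manager
                self.timeout = timeout
                self.time_since_seen = 0.0

            def apply(self, model, dt, tracked_pose=None):
                if tracked_pose is not None:
                    self.time_since_seen = 0.0
                    real = np.asarray(tracked_pose.position, dtype=float)
                    # Virtual position = tracked real position + offset.
                    model.transform.position = (
                        real + self.redirection.offset_for(real))
                    model.transform.rotation = tracked_pose.rotation
                    model.transform.scale = tracked_pose.scale
                else:
                    # Tracking lost: keep the last known Transform (the
                    # simplest possible assumption) until the timeout expires.
                    self.time_since_seen += dt
                    if self.time_since_seen > self.timeout:
                        model.destroy()  # assumed to remove it from the VE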

    4.5 Hand Models

    Figure 4.6: Hand Model – Domain model overview

    Figure 4.6 shows a diagram of the Hand Models Domain and its relationships. A HandModel is the virtual representation of one of the user's tracked hands. The HandModel is the palm model and has five FingerModels – its fingers, which are composed of various articulations – and both inherit from Model, as they are virtual objects that can be seen in the VE.
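    This composition could be sketched as follows, reusing the Model and Transform sketches from Section 4.2; the three-joint default is an assumption:

        class FingerModel(Model):
            # A finger: a chain of articulation Transforms.
            def __init__(self, n_joints=3):
                super().__init__()
                self.articulations = [Transform() for _ in range(n_joints)]

        class HandModel(Model):
            # The palm Model, composed with its five fingers.
            def __init__(self):
                super().__init__()
                self.fingers = [FingerModel() for _ in range(5)]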

    Like the ObjectModel, both HandModel and FingerModel have an UpdateBehavior, which is responsible for updating their Transforms according to data received from the HandTrackingController and collisions that can occur with other objects. In addition, the HandUpdateBehavior has to update the hand position according to the offset received from the RedirectionManager.

    In a similar way to the ObjectUpdateBehavior, the HandUpdateBehavior and FingerUpdateBehavior should also implement assumption mechanisms so that, when the tracking controller loses track of the hand, updates to their Transforms – based on those assumptions – are still executed, until the objects are destroyed after a timeout.

    As the user has the ability to interact with objects, each HandModel has an associated InteractionRecognizer that analyses the hand's pose and triggers the InteractionBehavior that corresponds to that pose. The InteractionBehavior has one or more ObjectModels associated, as one hand can interact with more than one object at a time. As of now – if required, further interactions can be specified in the future – there are three types of InteractionBehavior (a minimal dispatch sketch is given after the list below):

    • TouchInteraction, a simple or complex interaction with objects, triggered by a collision of the hand or its fingers with the objects;


    • PinchInteraction, a pinch gesture to pick up an object, triggered by a pinch pose of the hand near the object;

    • GrabInteraction, a grab gesture to pick up one or more objects, triggered by a grab pose near at least one object.
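    A minimal dispatch sketch of the recognizer follows; the class and method names, and the pose rules described in the comments, are our own assumptions:

        class InteractionRecognizer:
            def __init__(self, hand_model, behaviors):
                self.hand = hand_model
                # e.g. {"touch": TouchInteraction(),
                #       "pinch": PinchInteraction(),
                #       "grab": GrabInteraction()}
                self.behaviors = behaviors

            def update(self, nearby_objects):
                # Classify the current hand pose and trigger the matching
                # InteractionBehavior on the objects near the hand.
                pose = self.classify_pose()
                behavior = self.behaviors.get(pose)
                if behavior is not None:
                    behavior.trigger(self.hand, nearby_objects)

            def classify_pose(self):
                # Placeholder: a real recognizer would inspect articulation
                # data from the HandTrackingController (e.g. thumb-index
                # distance for "pinch", finger curl for "grab"), defaulting
                # to "touch".
                return "touch"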

    These interactions