
BUILDING AR AND VR EXPERIENCES

Mark Billinghurst mark.billinghurst@unisa.edu.au

May 10th – 13th, 2016 Xi’an, China

Mark Billinghurst
• University of South Australia
• Past Director of HIT Lab NZ, University of Canterbury
• PhD Univ. Washington
• Research on AR, mobile HCI, Collaborative Interfaces
• More than 300 papers in AR, VR, interface design

AR/VR Course • Lectures

•  2:30 - 4:30 pm every day •  Lectures/hands-on

• Logistics • Bring your own laptop if possible • Use Android phone • Share computer/phone

• Material • All material available for download

What You Will Learn
• AR/VR fundamentals + history
• Basics of Unity programming
• How to make Panorama VR applications
• How to create VR scenes
• How to add interactivity to VR applications
• Using the Vuforia AR tracking library
• Creating AR scenes
• Adding AR interactivity
• Design guidelines
• Research directions

Schedule • Tuesday May 10th

•  Introduction, Learning Unity, Building 360 VR scenes

• Wednesday May 11th

• Creating 3D scenes, adding interactivity, good design

• Thursday May 12th •  Introduction to AR, Vuforia basics, building AR scenes •  Field trip to ARA demo space

• Friday May 13th

• Adding interactivity, advanced AR tracking, research

ARA Demos

Hololens

CoolGlass

HTC Vive And more …

INTRODUCTION

A Brief History of Time

• Trend •  smaller, cheaper, more functions, more intimate

• Technology becomes invisible •  Intuitive to use •  Interface over internals • Form more important than function • Human centered design

A Brief History of Computing

• Trend • smaller, cheaper, faster, more intimate, intelligent objects

• Computers need to become invisible • hide the computer in the real world

• Ubiquitous / Tangible Computing • put the user inside the computer

• Virtual Reality

Making Interfaces Invisible

Rekimoto, J. and Nagao, K. 1995. The world through the computer: computer augmented interaction with real world environments. In Proceedings of the 8th Annual ACM Symposium on User interface and Software Technology. UIST '95. ACM, New York, NY, 29-36.

Graphical User Interfaces

• Separation between real and digital worlds • WIMP (Windows, Icons, Menus, Pointer) metaphor

Ubiquitous Computing

• Computing and sensing embedded in real world •  Particle devices, RFID, motes, Arduino, etc

Virtual Reality

• Immersive VR • Head mounted display, gloves • Separation from the real world

Augmented Reality

1977 – Star Wars

Augmented Reality Definition

• Defining Characteristics [Azuma 97] • Combines Real and Virtual Images

• Both can be seen at the same time • Interactive in real-time

• The virtual content can be interacted with • Registered in 3D

• Virtual objects appear fixed in space

Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.

2008 - CNN

AR vs VR

From Reality to Virtual Reality

Ubiquitous Computing Augmented Reality Virtual Reality

Milgram’s Reality-Virtuality continuum

Mixed Reality

Reality - Virtuality (RV) Continuum

Real Environment

Augmented Reality (AR)

Augmented Virtuality (AV)

Virtual Environment

"...anywhere between the extrema of the virtuality continuum."

P. Milgram and F. Kishino, A Taxonomy of Mixed Reality Visual Displays. IEICE Transactions on Information and Systems, E77-D(12), pp. 1321-1329, 1994.

VIRTUAL REALITY

Virtual Reality

Computer generated multi-sensory simulation of an artificial environment that is interactive and immersive.

David Zeltzer’s AIP Cube

• Autonomy – User can react to events and stimuli.

• Interaction – User can interact with objects and environment.

• Presence – User feels immersed through sensory input and output channels.

Interaction

Autonomy

Presence

VR

Zeltzer, D. (1992). Autonomy, interaction, and presence. Presence: Teleoperators & Virtual Environments, 1(1), 127-132.

Key Technologies • Autonomy

• Head tracking, body input •  Intelligent systems

•  Interaction • User input devices, HCI

• Presence • Graphics/audio/multisensory output • Multisensory displays

•  Visual, audio, haptic, olfactory, etc

Early Experimenters (1950’s – 80’s)

Heilig 1956

Sutherland 1965

Furness 1970’s

Ivan Sutherland HMD

The First Wave (1980’s – 90’s)

NASA 1989 VPL 1990’s

Virtuality 1990’s

Jaron Lanier

•  Founded VPL, coined term “Virtual Reality”

Desktop VR - 1995 • Expensive - $150,000+ •  2 million polys/sec • VGA HMD – 30 Hz • Magnetic tracking

Second Wave (2010 - ) • Palmer Luckey

• HMD hacker • Mixed Reality Lab (MxR)

• Oculus Rift (2011 - ) •  2012 - $2.4 million Kickstarter •  2014 - $2B acquisition by Facebook •  $350 USD, 110° FOV

Oculus Rift

Sony Morpheus

HTC/Valve Vive

2016 - Rise of Consumer HMDs

Desktop VR 2016 • Graphics Desktop

• $1,500 USD • >4 Billion poly/sec

• $600 HMD • 1080x1200, 90Hz

• Optical tracking • Room scale

https://immersivelifeblog.files.wordpress.com/2015/04/vr_history.jpg

Market Size

Computer Based vs. Mobile VR

Mobile VR

1998: SGI O2
• CPU: 300 MHz, HDD: 9 GB, RAM: 512 MB
• Camera: VGA 30 fps
• Graphics: 500K poly/sec

2008: Nokia N95
• CPU: 332 MHz, HDD: 8 GB, RAM: 128 MB
• Camera: VGA 30 fps
• Graphics: 2M poly/sec

Mobile Phone AR & VR • Mobile Phone AR

• Mobile phone • Live camera view • Sensor input (GPS, compass)

• Mobile Phone VR • Mobile phone • Sensor input (compass) • Additional VR viewer

VR2GO (2013)

• MxR Lab •  3D print VR viewer for mobiles • Open source hardware + software •  http://projects.ict.usc.edu/mxr/diy/vr2go/

Multiple Mobile VR Viewers Available


CARDBOARD VR


Google Cardboard

• Released 2014 (Google 20% project) • >5 million shipped/given away • Easy to use developer tools


Cardboard ($2)

Lenses ($10)

Magnets ($6)

Velcro ($3)

Rubber Band (1¢)

Software

Components

Assembling the Cardboard Viewer

Version 1.0 vs Version 2.0

• Version 1.0 – Android focused, magnetic switch, small phone • Version 2.0 – Touch input, iOS/Android, fits many phones

Many Different Cardboard Viewers

SAMPLE CARDBOARD APPLICATIONS

Cardboard App • 7 default experiences

• Earth: Fly on Google Earth

•  Tour Guide: Visit sites with guides

• YouTube: Watch popular videos

• Exhibit: Examine cultural artifacts

• Photo Sphere: Immersive photos

• Street View: Drive along a street

• Windy Day: Interactive short story

Hundreds of Google Play Cardboard apps

Sample Applications

Cardboard Camera

• Capture 360 panoramas • Stitch together images on phone • View in VR on Cardboard

Google Expeditions

• Teacher led VR experiences • https://www.google.com/edu/expeditions/

Building Your Own Application • Cardboard Viewer

•  https://www.google.com/get/cardboard/

• Smart phone • Android/iOS

• Cardboard SDK •  iOS, Android, Unity •  https://developers.google.com/cardboard/

• Unity game engine (optional) •  https://unity3d.com

• Content

Cardboard SDK

Features:
1. Lens distortion correction.
2. Head tracking.
3. 3D calibration.
4. Side-by-side rendering.
5. Stereo geometry configuration.
6. User input event handling.

Unity Cardboard SDK

INTRODUCTION TO UNITY

Unity 3D Game Editor

SETUP

Download and Install • Go to unity3d.com/download • Use Download Assistant – pick components you want

Getting Started •  First time running Unity you’ll be asked to create a project • Specify project name and location • Can pick asset packages (pre-made content)

Unity Interface •  Toolbar, Scene, Hierarchy, Project, Inspector

Customizable Interface

Building Scenes • Use GameObjects:

• Containers that hold different components •  Eg 3D model, texture, animation

• Use Inspector • View and edit object properties and other settings

• Use Scene View • Position objects, camera, lights, other GameObjects etc

• Scripting • Adding interaction, user input, events, etc

GameObjects • Every object in Scene is a GameObject • GameObjects contain Components

•  Eg Transform Component, Camera Component

Adding 3D Content

• Create 3D asset using modeling package, or download •  Fbx, Obj file format for 3D models

• Add file to Assets folder in Project • When project opened 3D model added to Project View • Drag mesh from Project View into Hierarchy or Scene View

•  Creates a game object

Positioning/Scaling Objects

• Click on object and choose transform

Unity Asset Store

• Download thousands of models, scripts, animations, etc •  https://www.assetstore.unity3d.com/

UNITY BASICS

Making a Simple Scene 1.  Create New Project 2.  Create Game Object 3.  Moving main camera position 4.  Adding lights 5.  Adding more objects 6.  Adding physics 7.  Changing object materials 8.  Adding script behaviour

Create Project

• Create new folder and project

New Empty Project

Create GameObject

•  Load a Sphere into the scene • GameObject -> 3D Object -> Sphere

Moving main camera

• Select Main Camera • Select translate icon • Move camera

Add Light

• GameObject -> Light -> Directional Light • Use inspector to modify light properties (colour, intensity)

Add Physics

•  Select Sphere •  Add Rigidbody component

•  Add Component -> Physics -> Rigidbody

•  Modify inspector properties (mass, drag, etc)

Add More Objects

• Add several cubes •  GameObject -> 3D Object -> Cube

• Move cube • Add Rigid Body component (uncheck gravity)

Add Material

• Assets -> Create -> Material • Click Albedo colour box in inspector • Select colour • Drag asset onto object to apply
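The editor steps above (create an object, move the camera, add a light, add physics, set a material) can also be reproduced in code. A minimal sketch using the standard Unity API — class and object names are illustrative, and it assumes a scene with a Main Camera:

```csharp
using UnityEngine;

// Builds the simple scene from the steps above at runtime:
// a sphere with physics, a directional light, and a repositioned camera.
public class SimpleScene : MonoBehaviour
{
    void Start()
    {
        // Create a sphere (GameObject -> 3D Object -> Sphere)
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.position = new Vector3(0, 2, 0);

        // Add physics so the sphere falls under gravity
        sphere.AddComponent<Rigidbody>();

        // Add a directional light (GameObject -> Light -> Directional Light)
        GameObject lightObj = new GameObject("Directional Light");
        Light light = lightObj.AddComponent<Light>();
        light.type = LightType.Directional;

        // Move the main camera back so the sphere is in view
        Camera.main.transform.position = new Vector3(0, 1, -10);

        // Change the sphere's material colour (the Albedo colour)
        sphere.GetComponent<Renderer>().material.color = Color.red;
    }
}
```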

Add Script

• Assets -> Create -> C# script • Edit script using Mono • Drag script onto Game Object

Example C# Script GameObject Rotation

using UnityEngine;
using System.Collections;

public class spin : MonoBehaviour {

    // Use this for initialization
    void Start () {
    }

    // Update is called once per frame
    void Update () {
        this.gameObject.transform.Rotate(Vector3.up * 10);
    }
}

Scripting C# Unity 3D •  void Awake():

•  Is called when the first scene is loaded and the game object is active

•  void Start(): •  Called on first frame update

•  void FixedUpdate(): •  Called before physics calculations are made

•  void Update(): •  Called every frame before rendering

•  void LateUpdate(): •  Once per frame after update finished
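A small sketch that makes the calling order of these methods visible in the Console (attach it to any active GameObject):

```csharp
using UnityEngine;

// Logs the Unity lifecycle callbacks listed above so their
// calling order can be observed in the Console.
public class LifecycleDemo : MonoBehaviour
{
    void Awake()       { Debug.Log("Awake: scene loaded, object active"); }
    void Start()       { Debug.Log("Start: before the first frame update"); }
    void FixedUpdate() { Debug.Log("FixedUpdate: before each physics step"); }
    void Update()      { Debug.Log("Update: every frame, before rendering"); }
    void LateUpdate()  { Debug.Log("LateUpdate: after all Updates finish"); }
}
```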

Final Spinning Cube Scene


LECTURE 2: VR SCENES

IMMERSIVE PANORAMAS

Types of VR Experiences •  Immersive Spaces

•  360 Panoramas/Movies • High visual quality •  Limited interactivity

•  Changing viewpoint orientation

•  Immersive Experiences •  3D graphics

•  Lower visual quality • High interactivity

•  Movement in space •  Interact with objects

Immersive Panorama

• High quality 360 image or video surrounding user • User can turn head to see different views •  Fixed position

Demo: Cardboard Camera

• Capture 360 panoramas • Stitch together images on phone • View in VR on Cardboard

Example Applications • VRSE – Storytelling for VR

•  http://vrse.com/ • High quality 360 VR content

• New York Times VR Experience • NYTVR application • Documentary experiences

• Vrideo •  http://vrideo.com/ • Streamed immersive movies

Vrideo Website – vrideo.com

Capturing Panorama • Stitching photos together

•  Image Composite Editor (Microsoft) • AutoPano (Kolor)

• Using 360 camera • Ricoh Theta-S •  Fly360

Image Composite Editor (Microsoft)

•  Free panorama stitching tool •  http://research.microsoft.com/en-us/um/redmond/projects/ice/

AutoPano (Kolor)

•  Finds images that form panoramas and stitches them together •  http://www.kolor.com/autopano/

Steps to Make Immersive Panorama 1.  Create a new project 2.  Load the Cardboard SDK 3.  Load a panorama image asset 4.  Create a Skymap 5.  Add to VR scene 6.  Deploy to mobile phone

Need

• Google Cardboard SDK Unity package • Android SDK to install on Android phone

New Project

Load Cardboard SDK

• Assets -> Import Package -> Custom Package •  Navigate to CardboardSDKForUnity.unitypackage

• Uncheck iOS (for Android build)

Load Cardboard Main Camera

• Drag CardboardMain prefab into Hierarchy •  Assets -> Cardboard -> Prefab

• Delete CameraMain

Panorama Image Asset

•  Find/create suitable panorama image •  Ideally 2K or higher resolution image

• Google “Panorama Image Cubemap”

Add Image Asset to Project • Assets -> Import Asset

•  Select desired image

• Set Texture Type to Cubemap

• Set mapping to Latitude-Longitude (Cylindrical)

Create Skybox Material

• Assets -> Create -> Material • Name material • Set Shader to Skybox -> Cubemap • Drag texture to cubemap

Create Skybox • Window -> Lighting • Drag Skybox material into the Skybox slot

Panorama Image in Unity

One Last Thing..

• CardboardMain -> Head -> Main Camera • Set Clear Flags to Skybox

Test It Out

• Hit play, use alt/option key + mouse to look around

Deploy to Mobile (Android) 1.  Plug phone into USB

• make sure device is in debug mode 2.  Set correct build settings 3.  Player settings

• Other settings • Set Bundle Identifier -> com.Company.ProductName

• Resolution and Presentation • Default Orientation -> Landscape Left

4.  Build and run

Deploying to Phone 1.  Plug phone into USB 2.  Open Build Settings 3.  Change Target platform to Android 4.  Select Player Settings 5.  Resolution and Presentation

•  Default Orientation -> Landscape Left 6.  Under Other Settings

•  Edit Bundle Identifier – eg com.UniSA.cubeTest •  Minimum API level

7.  Build and Run •  Select .apk file name

Running on Phone

• Droid@Screen View on Desktop

Making Immersive Movie • Create movie texture

•  Convert 360 video to .ogg or .mp4 file •  Add video texture as asset

• Make Sphere •  Equirectangular UV mapping •  Inward facing normals •  Move camera to centre of sphere

•  Texture map video to sphere •  Easy Movie Texture ($65)

• Apply texture to 3D object

•  For 3D 360 video •  Render two Spheres •  http://bernieroehl.com/360stereoinunity/
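The slides use a paid asset for the texture mapping; for the "inward facing normals" step, a hand-rolled alternative is to flip the sphere's mesh normals and triangle winding at runtime. A sketch (attach to a standard Unity sphere; it assumes the video/panorama texture is already applied):

```csharp
using UnityEngine;

// Makes a sphere render on the inside: inverts its normals and
// reverses the triangle winding so the texture is visible from a
// camera placed at the sphere's centre.
public class InsideOutSphere : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;

        // Flip every normal to point inward
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            normals[i] = -normals[i];
        mesh.normals = normals;

        // Reverse winding order so the inward faces are not back-face culled
        int[] tris = mesh.triangles;
        for (int i = 0; i < tris.Length; i += 3)
        {
            int tmp = tris[i];
            tris[i] = tris[i + 2];
            tris[i + 2] = tmp;
        }
        mesh.triangles = tris;
    }
}
```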


LECTURE 3: 3D SCENES

CREATING 3D ENVIRONMENTS

3D Virtual Environments

• Viewing content in true 3D • Moving/interacting with scene

Example: Cardboard Tuscany Drive

• Place viewer inside 3D scene • Navigate by head pointing

Key Steps 1.  Creating a new project 2.  Load Cardboard SDK 3.  Replace camera with CardboardMain 4.  Loading in 3D asset packages 5.  Loading a SkyDome 6.  Adding a plane floor

New Project

• Main Camera replaced with CardboardMain

Download Model Package

• Magic Lamp from 3dFoin • Search on Asset store

Load Asset + Add to Scene

• Assets -> Import Package -> Custom Package •  Look for MagicLamp.unitypackage (If not installed already)

• Drag MagicLamp_LOD0 to Hierarchy • Position and rotate

Import SkySphere package

• SkySphere Volume1 on Asset store •  Import SkySphere package

Add SkySphere to Scene

• Drag Skyball_WithoutCap into Hierarchy •  SkySphere_V1 -> Meshes

• Rotate and Scale as needed

Add Ground Plane

• GameObject -> 3D Object -> Plane • Set Scale X to 2.0, Z to 2.0

Testing View

• Use alt/option key plus mouse to rotate view

Adding More Assets

•  Load from Asset store – look for free assets •  Assets -> Import Package -> Custom Package

Final Scene

ADDING INTERACTIVITY

Moving through space

• Move through looking •  Look at target to turn on/off moving

• Button/tapping screen • Being in a vehicle (e.g. Roller Coaster)

Adding Movement Goal: Move in direction user looking when Cardboard Button pressed or screen touched

• Key Steps 1.  Start with static screen 2.  Create movement script 3.  Add movement script to Camera head 4.  Deploy to mobile

Static Scene

Create Movement Script • Add new script object

•  Assets -> Create -> C# Script

• Edit script in Mono
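A minimal movement script along these lines — it assumes it is attached to the Head object under CardboardMain and moves the parent rig (so it does not fight head tracking), and it treats a screen touch/Cardboard trigger as mouse button 0, Unity's default mapping. The class name and speed value are illustrative:

```csharp
using UnityEngine;

// Moves the camera rig in the current gaze direction while the
// screen is touched (or the Cardboard trigger is pressed).
public class LookMove : MonoBehaviour
{
    public float speed = 2.0f;  // metres per second, tune in Inspector

    void Update()
    {
        // Screen tap / Cardboard trigger registers as mouse button 0
        if (Input.GetMouseButton(0))
        {
            // transform.forward is the current gaze direction
            Vector3 step = transform.forward * speed * Time.deltaTime;
            step.y = 0;  // stay on the ground plane

            // Move the parent (CardboardMain) rather than the tracked Head
            transform.parent.position += step;
        }
    }
}
```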

Add Script to Scene

• Drag Script onto Head object •  CardboardMain -> Head

• Uncheck Track Position Box • Adjust movement speed

Gaze Interaction

• Cause events to happen when looking at objects •  Eg look at a target to shoot it

Steps • Add physics ray caster

• Casts a ray from camera • Add function to object to respond to gaze

• Eg particle effect • Add event trigger to object • Add event system to scene • Add collider object to target object

Adding Physics Raycaster

• Select Main Camera • CardboardMain -> Head -> Main Camera

• Add Physics Raycaster Component • Add component -> Physics Raycaster

Add Gaze Function

• Select target object (Lamp) • Add component -> script

• Add stareAtLamp() public function

Add Event Trigger

• Select Target Object (Lamp) • Add component

•  Event Trigger •  Add New Event Type -> PointerExit

• Add object to event •  Hit ‘+’ tag •  Drag Lamp object to box under Runtime Only

• Select Function to run •  Select function list -> scroll to stareAtLamp

Adding Event System

• Need Event System for trigger to work • Select Lamp object

• UI -> Event System

• Add gazeInputModule • Add component -> Cardboard -> Gaze Input Module

Add Collider to Object

• Need to detect when target being looked at •  Select Lamp Object

• Add Sphere Collider •  Add component -> Sphere Collider (type in “Sphere”)

• Adjust position and radius of Sphere Collider if needed

Add Gaze Event • Particles triggered looking at lamp • Add particle system

• Add Component -> Particle System • Pick colour •  set Emission rate to 0

• Add code to stareAtLamp() function • GetComponent<ParticleSystem>().Emit(10); •  Turns particle system on when looked at
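Putting the pieces together, the Lamp's script might look like this sketch (the class name is illustrative; the Event Trigger configured above calls stareAtLamp()):

```csharp
using UnityEngine;

// Gaze-handler sketch for the Lamp object: the Event Trigger set up
// above calls stareAtLamp(), which fires a burst of particles.
public class LampGaze : MonoBehaviour
{
    // Public so it can be wired to the Event Trigger in the Inspector
    public void stareAtLamp()
    {
        // The Particle System was added with Emission rate set to 0,
        // so particles appear only when this event fires
        GetComponent<ParticleSystem>().Emit(10);
    }
}
```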

Gaze Demo

• Particles launched from Lamp when looked at

Adding More Interactivity

•  Load Cardboard Demo application •  Assets -> Import Package -> Custom Package •  Load CardboardDemoForUnity.unitypackage

•  Launch Demo Scene •  Assets -> Cardboard -> DemoScene

Features Shown

• Gaze reticle + selection • Viewpoint teleportation • Menu panel overlay • Audio feedback • Event system

DESIGN GUIDELINES

Google Design Guidelines

• Google’s Guidelines for good VR experiences: • Physiological Considerations •  Interactive Patterns

•  Setup •  Controls •  Feedback •  Display Reticle

•  From http://www.google.com/design/spec-vr/designing-for-google-cardboard/a-new-dimension.html

Physiological Considerations • Factors to Consider

• Head tracking • User control of movement • Use constant velocity • Grounding with fixed objects • Brightness changes

Interactive Patterns - Setup • Setup factors to consider:

• Entering and exiting • Headset adaptation • Full Screen mode • API calls •  Indicating VR apps

Interactive Patterns - Controls

• Use fuse buttons for selection in VR

Interactive Patterns - Feedback

• Use audio and haptic feedback • Reduce visual overload • Audio alerts • 3D spatial sound • Phone vibrations

Interactive Patterns - Display Reticle

• Easier for users to target objects with a display reticle • Can display reticle only when near target object • Highlight objects (e.g. with light source) that user can target

Cardboard Design Lab Application

• Use Cardboard Design Lab app to explore design ideas


LECTURE 4: AUGMENTED REALITY

1977 – Star Wars – Augmented Reality



Augmented Reality Examples

Where Can You Use AR/VR?


Summary

• Augmented Reality has three key features • Combines Real and Virtual Images •  Interactive in real-time • Registered in 3D

• AR can be classified alongside other technologies • Milgram’s Mixed Reality continuum

TECHNOLOGY

Augmented Reality Definition

• Defining Characteristics • Combines Real and Virtual Images

• Display Technology • Interactive in real-time

• Interaction Technology • Registered in 3D

• Tracking Technology

DISPLAY

Display Technologies

• Types (Bimber/Raskar 2003)
• Head attached
  • Head mounted display/projector
• Body attached
  • Handheld display/projector
• Spatial
  • Spatially aligned projector/monitor

TRACKING

Objects Registered in 3D

• Registration • Positioning virtual object with respect to the real world

• Tracking • Continually locating the user’s viewpoint

•  Position (x,y,z), Orientation (r,p,y)

Tracking Technologies

• Active
  • Mechanical, Magnetic, Ultrasonic
  • GPS, Wifi, cell location

• Passive
  • Inertial sensors (compass, accelerometer, gyro)
  • Computer Vision
    • Marker based, Natural feature tracking

• Hybrid Tracking
  • Combined sensors (eg Vision + Inertial)

Tracking Types

Magnetic Tracker

Inertial Tracker

Ultrasonic Tracker

Optical Tracker

Marker-Based Tracking

Markerless Tracking

Specialized Tracking

Edge-Based Tracking

Template-Based Tracking

Interest Point Tracking

Mechanical Tracker

INTERACTION

• Interface Components • Physical components • Display elements

• Visual/audio • Interaction metaphors

Physical Elements

Display Elements Interaction

Metaphor Input Output

AR Interface Elements

AR Design Space

Reality Virtual Reality

Augmented Reality

Physical Design Virtual Design

AR APPLICATIONS

• Web based AR •  Flash, HTML 5 based AR •  Marketing, education

• Outdoor Mobile AR •  GPS, compass tracking •  Viewing Points of Interest in real world •  Eg: Junaio, Layar, Wikitude

• Handheld AR •  Vision based tracking •  Marketing, gaming

•  Location Based Experiences •  HMD, fixed screens •  Museums, point of sale, advertising

Typical AR Experiences

CityViewAR Application

•  Visualize Christchurch before the earthquakes

User Experience

• Multiple Views

• Map View, AR View, List View

• Multiple Data Types • 2D images, 3D content, text, panoramas

Warp Runner

• Puzzle solving game • Deform real world terrain

Demo: colAR

• Turn colouring book pages into AR scenes • Markerless tracking, use your own colours

• Try it yourself: http://www.colARapp.com/

What Makes a Good AR Experience?

• Compelling • Engaging, ‘Magic’ moment

• Intuitive, ease of use • Uses existing skills

• Anchored in physical world • Seamless combination of real and digital

USING VUFORIA Mark Billinghurst

mark.billinghurst@unisa.edu.au

What you will learn •  Introduction to Vuforia

•  Platform and features

• How to install/set-up Vuforia • Vuforia Basics

•  Marker Tracking, Object tracking

• Deploying to Mobile Device •  Android, iOS

OVERVIEW

Vuforia Overview • Platform for Mobile Computer Vision

•  https://www.qualcomm.com/products/vuforia

• Released by Qualcomm in 2010, acquired by PTC 2015 • Used by over 200K developers, >20K applications

• Main Features: • Recognition

•  Image, text, object recognition

•  Tracking •  Image, marker, scene, object

Vuforia Provides

Device SDK
• Android
• iOS
• Unity Extension

Tools & Services
• Target Management System
• App Development Guide
• Vuforia Web Services

Support Forum
• Dedicated technical support engineers
• Thousands of posts

Vuforia Features

Tracking Targets

Image

Object

Environment

Developer Tools

Target Manager

Cloud Services

Platform Anatomy

User Experiences Enabled

INSTALLATION

Download Vuforia for Unity SDK •  https://developer.vuforia.com/downloads/sdk

Download Samples •  https://developer.vuforia.com/downloads/samples

Installing Vuforia Unity Extension • Create new Unity Project •  Import the Vuforia Unity Extension

• Double clicking the *.unitypackage file •  Eg vuforia-unity-5-0-6.unitypackage

• Manually install package •  Assets -> Import Package -> Custom Package

• The extension archive will self install •  folders, plugins and libraries, etc

Imported Vuforia Assets

Unity Asset Structure

• Editor - Contains the scripts required to interact with Target data in the Unity editor

• Plugins - Contains Java and native binaries that integrate the Vuforia AR SDK with the Unity Android or Unity iOS application

• Vuforia - Contains the prefabs and scripts required to bring AR to your application

• Streaming Assets / QCAR - Contains the Device Database configuration XML and DAT files from the online Target Manager

USING VUFORIA

Setting up a Vuforia Project • Register as Developer • Create a Project • Obtain a License Key • Load Vuforia package into Unity • Add license key to AR Camera • Add Tracking Targets • Move ImageTarget into Scene • Add sample object to ImageTarget

Register as Developer •  https://developer.vuforia.com/user/register

Download Vuforia Packages • Go to download URL – log in

•  https://developer.vuforia.com/downloads/sdk

• Download Current SDK for Unity •  vuforia-unity-5-5-9.unitypackage

• Download Core Features Sample •  vuforia-samples-core-unity-5-5-9.zip

Create License Key •  https://developer.vuforia.com/targetmanager/licenseManager/licenseListing

Obtain a License Key

• Vuforia 5.5 apps utilize a license key that uniquely identifies each app. License keys are created in the License Manager.

• The steps to create a new license key are:
  • Choose an SDK
  • Choose a licensing option based on your requirements
  • Provide your billing information if you've chosen a paid license
  • Obtain your license key

License Key Generated – Save This

Load Vuforia Package

• Open Unity •  Load package

•  Assets -> Import Package -> Custom Package •  Load vuforia-unity-5-5-9 (or current version)

• Note: •  On Windows, Vuforia only works with the 32-bit version of Unity •  You may need to download the 32-bit Unity build

Add License Key to Vuforia Project •  Open ARCamera Inspector in Vuforia

•  Assets -> Vuforia -> Prefabs •  Move AR Camera to scene hierarchy (Delete Main Camera) •  Paste License Key

Adding Tracking Targets • Create a target on the Target Manager

•  https://developer.vuforia.com/targetmanager/

• OR - Use existing targets from other projects

Which Type of Database • Device Database vs. Cloud Database? • Device: local, Cloud: online

Creating a Target • Create a database • Add targets

Selecting Target Type

Sample Tracking Images

Loaded Image Target

• Rating indicates how good a target is • Download Dataset -> create Unity package

• Eg StoneImage.unitypackage

Loading the Tracking Image

•  Import tracking dataset package •  Assets -> Import Package -> Custom Package

• Drag ImageTarget prefab into Scene Hierarchy • Select ImageTarget, pick Data Set then Image Target

•  Set image width

• On AR Camera load target database and activate •  Database Load Behaviour

ImageTarget Loaded

Testing the Camera View

Add 3D Content • As a test, create a simple Cube object

•  GameObject > Create Other > Cube

• Add the cube as a child of the ImageTarget object by dragging it onto the ImageTarget item.

• Move the cube until it is centered on the Image Target.

AR Test View

DEPLOYING TO MOBILE APPLICATION

• Unity
• Creating the Application
• Configure the export settings and build the Application

Building for Android • Open Build Settings • Change Target platform to Android • Switch Platform • Under Player Settings

• Edit Bundle Identifier – eg com.UniSA.cubeTest • Minimum API level

• Build and Run • Select .apk file name

ADDING INTERACTION

Adding Interaction

•  Look at Vuforia core samples • Virtual Buttons

•  Create virtual buttons on the page that trigger actions •  Eg touch a button to change object colour
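Based on the pattern in the Vuforia core samples, a virtual button handler might look like the following sketch. The Vuforia namespace and handler interface are from SDK 5.x; lampModel and the colour change are placeholders for your own scene:

```csharp
using UnityEngine;
using Vuforia;  // Vuforia Unity extension (namespace in SDK 5.x)

// Sketch of a Virtual Button handler: pressing the on-page button
// changes the colour of a target object while held.
public class ButtonHandler : MonoBehaviour, IVirtualButtonEventHandler
{
    public GameObject lampModel;  // object whose colour changes (placeholder)

    void Start()
    {
        // Register for events from every virtual button under this target
        VirtualButtonBehaviour[] buttons =
            GetComponentsInChildren<VirtualButtonBehaviour>();
        foreach (VirtualButtonBehaviour vb in buttons)
            vb.RegisterEventHandler(this);
    }

    public void OnButtonPressed(VirtualButtonAbstractBehaviour vb)
    {
        lampModel.GetComponent<Renderer>().material.color = Color.red;
    }

    public void OnButtonReleased(VirtualButtonAbstractBehaviour vb)
    {
        lampModel.GetComponent<Renderer>().material.color = Color.white;
    }
}
```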

Virtual Buttons

RESOURCES

Books • Unity Virtual Reality Projects

•  Jonathan Linowes

• Holistic Game Development with Unity •  Penny de Byl

Cardboard Resources • Google Cardboard main page

•  https://www.google.com/get/cardboard/

• Developer Website •  https://www.google.com/get/cardboard/developers/

• Building a VR app for Cardboard •  http://www.sitepoint.com/building-a-google-cardboard-vr-app-in-unity/

• Creating VR game for Cardboard •  http://danielborowski.com/posts/create-a-virtual-reality-game-for-google-cardboard/

• Moving in VR space •  http://www.instructables.com/id/Prototyping-Interactive-Environments-in-Virtual-Re/

Vuforia Resources

• Vuforia Product Page • https://www.qualcomm.com/products/vuforia
• Vuforia Developer Page • https://developer.vuforia.com
• SDK Download Page • https://developer.vuforia.com/downloads/sdk
• Installing Vuforia for Unity extension • http://developer.vuforia.com/library/articles/Solution/Installing-the-Unity-Extension
• Tutorials • https://developer.vuforia.com/resources/tutorials

Unity Resources • Unity Main site

• http://www.unity3d.com/ • Holistic Development with Unity

• http://holistic3d.com • Official Unity Tutorials

• http://unity3d.com/learn/tutorials • Unity Coder Blog

• http://unitycoder.com

www.empathiccomputing.org

@marknb00

mark.billinghurst@unisa.edu.au
