SKINPUT: Appropriating the Body as an Input Surface. Presented by: Karan Sharma, ECE, B.Tech III Year


Post on 31-Oct-2014


DESCRIPTION

A presentation on Skinput by Karan Sharma (B.Tech, ECE, 3rd Year), I.T.S. Engineering College.

TRANSCRIPT

Page 1: Skinput

SKINPUT: Appropriating the Body as an Input Surface

Presented by: Karan Sharma, ECE, B.Tech III Year

Page 2: Skinput

Introduction:

- Advances in electronics mean mobile devices are becoming ever smaller.
- This leaves only limited space for user interaction.
- We need a large user-interaction space without losing the primary benefit of small size.
- Hence, alternative approaches are needed that enhance interaction with small mobile systems.

Page 3: Skinput

One surface happens to always travel with us: our skin. It offers proprioception and is easily accessible.

Page 4: Skinput

Basic Principle: Skinput is an input technology that uses bio-acoustic sensing to localize finger taps on the skin.

[Slide diagram: a finger tap on the skin is picked up by a bio-acoustic array of sensors, while a pico-projector displays the interface.]

When augmented with a pico-projector, the device can provide a direct-manipulation graphical user interface on the body. The technology was developed by Chris Harrison, Desney Tan, and Dan Morris at Microsoft Research's Computational User Experiences Group.

Page 5: Skinput

Why Bio-Acoustic Sensing?

EEG and fNIR: Brain-sensing technologies such as electroencephalography (EEG) and functional near-infrared spectroscopy (fNIR) have been used by HCI researchers to assess cognitive and emotional state. This work primarily looked at involuntary signals.

BCI: Direct brain-computer interfaces (BCIs), generally used by paralyzed users, still lack the bandwidth required for everyday computing tasks and demand levels of focus, training, and concentration that are incompatible with typical computer interaction.

EMG: Electromyography (EMG) senses the electrical signals generated by muscle activation during normal hand movement.

Page 6: Skinput

Bio-Acoustics:

Transverse wave propagation: Finger impacts displace the skin, creating transverse waves (ripples). A sensor is activated as the wave passes underneath it.

Longitudinal wave propagation: Finger impacts create longitudinal (compressive) waves that cause skeletal structures to vibrate. This, in turn, creates longitudinal waves that emanate outward from the bone (along its entire length) toward the skin.

Page 7: Skinput

Interactive Capabilities:

Variations in bone density, size, and mass, together with differing amounts of soft tissue and the filtering effect of joints, make different locations on the body acoustically distinct.

Page 8: Skinput

Playing Tetris: using the fingers as a control pad.

The system can connect to any mobile device, such as an iPod.

Page 9: Skinput

Projection of a dynamic graphical interface

Finger inputs are classified and processed in real time.

Page 10: Skinput

Sensing: A wearable bio-acoustic sensing array is built into an armband. The sensing elements detect vibrations transmitted through the body. The two sensor packages shown above each contain five specially weighted, cantilevered piezo films, each responsive to a particular frequency range.
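The idea of several elements each tuned to its own frequency band can be sketched in software as a small analysis bank. This is a hedged illustration, not the armband's actual signal path: the band centers and the 5.5 kHz sample rate below are assumptions, and the Goertzel algorithm simply stands in for the mechanical tuning of each weighted piezo film.

```python
import math

def goertzel_power(samples, sample_rate, freq):
    # Goertzel algorithm: signal power at one analysis frequency.
    k = int(0.5 + len(samples) * freq / sample_rate)
    w = 2 * math.pi * k / len(samples)
    coeff = 2 * math.cos(w)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# Five "sensing elements", each responsive to one band center (Hz).
# These centers are illustrative assumptions, not the prototype's values.
CENTERS = [25, 50, 100, 200, 400]

def element_responses(samples, sample_rate=5500):
    """Return one response value per simulated sensing element."""
    return [goertzel_power(samples, sample_rate, f) for f in CENTERS]
```

Feeding a synthetic vibration at 100 Hz into this bank makes the 100 Hz element respond most strongly, mirroring how only some piezo films are strongly actuated by a given tap.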

Page 11: Skinput

Armband Prototype:

Page 12: Skinput

Processing:

Ten channels of acoustic data are generated by three finger taps on the forearm, followed by three taps on the wrist. The exponential average of the channels is shown in red, and segmented input windows are highlighted in green. Note how different sensing elements are actuated by the two locations.
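The processing step described above (an exponential average over the channels, with input windows segmented around each tap) can be sketched as follows. This is a minimal assumption-laden sketch: the smoothing factor and threshold are invented values, not the parameters used by the real system.

```python
def exponential_average(channels, alpha=0.2):
    """channels: list of equal-length sample lists (one per sensor).
    Returns one smoothed trace over the per-sample mean of absolute
    values across all channels (the red curve on the slide)."""
    ema, trace = 0.0, []
    for i in range(len(channels[0])):
        frame = sum(abs(ch[i]) for ch in channels) / len(channels)
        ema = alpha * frame + (1 - alpha) * ema
        trace.append(ema)
    return trace

def segment_taps(trace, threshold=0.1):
    """Return (start, end) index pairs where the averaged trace exceeds
    the threshold (the green input windows on the slide)."""
    windows, start = [], None
    for i, v in enumerate(trace):
        if v >= threshold and start is None:
            start = i
        elif v < threshold and start is not None:
            windows.append((start, i))
            start = None
    if start is not None:
        windows.append((start, len(trace)))
    return windows
```

Given ten synthetic channels containing two tap-like bursts, `segment_taps` recovers two input windows, one per tap.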

Page 13: Skinput

Input Locations Considered :

The software listens for impacts and classifies them. Different interactive capabilities are then bound to the different regions.
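The slides do not name the classifier, so as a hedged stand-in here is a minimal nearest-centroid classifier over per-channel energy features. The feature choice and model are illustrative assumptions; the point is only that acoustically distinct locations yield separable feature vectors.

```python
def energy_features(window_channels):
    # One feature per sensing channel: total energy inside the tap window.
    return [sum(x * x for x in ch) for ch in window_channels]

class TapClassifier:
    """Illustrative nearest-centroid classifier: each input location is
    represented by the mean feature vector of its training taps."""

    def __init__(self):
        self.centroids = {}  # location label -> mean feature vector

    def fit(self, examples):
        """examples: list of (label, feature_vector) training taps."""
        by_label = {}
        for label, feats in examples:
            by_label.setdefault(label, []).append(feats)
        for label, rows in by_label.items():
            n = len(rows)
            self.centroids[label] = [sum(col) / n for col in zip(*rows)]

    def predict(self, feats):
        # Assign the tap to the closest location centroid.
        def dist(label):
            c = self.centroids[label]
            return sum((a - b) ** 2 for a, b in zip(feats, c))
        return min(self.centroids, key=dist)
```

Trained on a few taps per region, the classifier maps a new tap's features to the nearest known location, after which the bound interactive capability for that region can fire.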

Page 14: Skinput

Accuracy:

Five fingers: Classification accuracy remained high for the five-finger condition, averaging 87.7% across participants.

Whole arm: The below-elbow placement performed best, posting a 95.5% average accuracy. This is not surprising, as this condition placed the sensors closer to the input targets than the other conditions did. Moving the sensors above the elbow reduced accuracy to 88.3%. The eyes-free input condition yielded lower accuracies than the other conditions, averaging 85.0%.

Page 15: Skinput

Forearm: Classification accuracy for the ten-location forearm condition stood at 81.5%, a surprisingly strong result.

Higher accuracies can be achieved by collapsing the input locations into groups. Groupings A-E and G were created using a design-centric strategy; F was created following analysis of per-location accuracy data.

Page 16: Skinput

Walking and Jogging: The testing phase took roughly 3 minutes to complete (four trials in total: two participants, two conditions). The male participant walked at 2.3 mph and jogged at 4.3 mph; the female walked at 1.9 mph and jogged at 3.1 mph. In both walking trials, the system never produced a false-positive input, while true-positive accuracy was 100%; classification accuracy was 100% for the male and 86% for the female. Jogging likewise showed 100% true-positive accuracy, with classification accuracy of 100% for the male and 83.4% for the female.

Page 17: Skinput

BMI Considered: The prevalence of fatty tissue and the density/mass of bones tend to dampen or facilitate the transmission of acoustic energy in the body. The participants with the three highest BMIs (29.2, 29.6, and 31.9, representing borderline obese to obese) produced the three lowest accuracies.

Page 18: Skinput

Conclusion: Skinput is a novel, wearable bio-acoustic sensing approach: an array built into an armband detects and localizes finger taps on the forearm and hand. Experimental results show that the system performs well for a range of gestures, even while the body is in motion, and accuracy is expected to improve with further work.