Self Study: Haptics
TRANSCRIPT
8/2/2019
Introduction
Haptics refers to the sense of touch (from the Greek haptesthai, "I fasten onto, I touch"). It is a
form of nonverbal communication. Haptic technology, or haptics, is a tactile feedback
technology that takes advantage of the sense of touch by applying forces, vibrations, or
motions to the user. This mechanical stimulation can be used to assist in the creation
of virtual objects in a computer simulation, to control such virtual objects, and to
enhance the remote control of machines and devices (telerobotics). It has been
described as "doing for the sense of touch what computer graphics does for
vision". Haptic devices may incorporate tactile sensors that measure forces exerted by
the user on the interface.
Haptic technology has made it possible to investigate how the human sense of touch
works by allowing the creation of carefully controlled haptic virtual objects. These
objects are used to systematically probe human haptic capabilities, which would
otherwise be difficult to achieve. These research tools contribute to the understanding
of how touch and its underlying brain functions work.
Haptic Technology
History
One of the earliest applications of haptic technology was in large aircraft that use
servomechanism systems to operate control surfaces. Such systems tend to be "one-
way", meaning external forces applied aerodynamically to the control surfaces are not
perceived at the controls. Here, the missing normal forces are simulated with springs
and weights. In earlier, lighter aircraft without servo systems, as the aircraft approached
a stall, the aerodynamic buffeting (vibrations) was felt in the pilot's controls. This was a
useful warning of a dangerous flight condition. This control shake is not felt when servo
control systems are used. To replace the missing sensory cue, the angle of attack is
measured, and when it approaches the critical stall point a "stick shaker" (an
unbalanced rotating mass) is engaged, which simulates the response of a simpler control
system. Alternatively, the servo force may be measured and the signal directed to a
servo system on the control, a technique known as force feedback. Force feedback has been
implemented experimentally in some excavators and is useful when excavating mixed
material such as large rocks embedded in silt or clay. It allows the operator to "feel" and
work around unseen obstacles, enabling significant increases in productivity.

The PHANTOM interface from SensAble Technologies was one of the first haptic systems to
PHANTOM interface from SensAble Technologies was one of the first haptic systems to
be sold commercially. Its success lies in its simplicity. Instead of trying to display
information from many different points, this haptic device simulates touching at a single
point of contact. It achieves this through a stylus which is connected to a lamp-like arm.
Three small motors give force feedback to the user by exerting pressure on the stylus. So, a user can feel the elasticity of a virtual balloon or the solidity of a brick wall. He or
she can also feel texture, temperature and weight. The stylus can be customized so that
it closely resembles just about any object. For example, it can be fitted with a syringe
attachment to simulate what it feels like to pierce skin and muscle when giving a shot.
The CyberGrasp system, another commercially available haptic interface from
Immersion Corporation, takes a different approach. This device fits over the user's
entire hand like an exoskeleton and adds resistive force feedback to each finger. Five
actuators produce the forces, which are transmitted along tendons that connect the
fingertips to the exoskeleton. With the CyberGrasp system, users are able to feel the
size and shape of virtual objects that only exist in a computer-generated world. To make
sure a user's fingers don't penetrate or crush a virtual solid object, the actuators can be
individually programmed to match the object's physical properties.
Areas of Haptic Technology
If you thought the Apple iPhone was amazing, then feast your eyes -- and fingers -- on
this phone from Samsung. Dubbed the Anycall Haptic, the phone features a large touch-
screen display just like the iPhone. But it does Apple's revolutionary gadget one better,
at least for now: It enables users to feel clicks, vibrations and other tactile input. In all, it provides the user with 22 kinds of touch sensations.
Those sensations explain the use of the term haptic in the name. Haptic is from the
Greek "haptesthai," meaning to touch. As an adjective, it means relating to or based on
the sense of touch. As a noun, usually used in a plural form (haptics), it means the
science and physiology of the sense of touch. Scientists have studied haptics for
decades, and they know quite a bit about the biology of touch. They know, for example,
what kind of receptors are in the skin and how nerves shuttle information back and
forth between the central nervous system and the point of contact.
Unfortunately, computer scientists have had great difficulty transferring this basic understanding of touch into their virtual reality systems. Visual and auditory cues are
easy to replicate in computer-generated models, but tactile cues are more problematic.
It is almost impossible to enable a user to feel something happening in the computer's
mind through a typical interface. Sure, keyboards allow users to type in words, and
joysticks and steering wheels can vibrate. But how can a user touch what's inside the
virtual world? How, for example, can a video game player feel the hard, cold steel of his
or her character's weapon? How can an astronaut, training in a computer simulator, feel
the weight and rough texture of a virtual moon rock?
Since the 1980s, computer scientists have been trying to answer these questions. Their
field is a specialized subset of haptics known as computer haptics. As a field of study,
haptics has closely paralleled the rise and evolution of automation. Before the industrial
revolution, scientists focused on how living things experienced touch. Biologists learned
that even simple organisms, such as jellyfish and worms, possessed sophisticated touch
responses. In the early part of the 20th century, psychologists and medical researchers actively studied how humans experience touch. Appropriately, this branch of science
became known as human haptics, and it revealed that the human hand, the primary
structure associated with the sense of touch, was extraordinarily complex.
With 27 bones and 40 muscles, including muscles located in the forearm, the hand
offers tremendous dexterity. Scientists quantify this dexterity using a concept known as
degrees of freedom. A degree of freedom is movement afforded by a single joint.
Because the human hand contains 22 joints, it allows movement with 22 degrees of
freedom. The skin covering the hand is also rich with receptors and nerves, components
of the nervous system that communicate touch sensations to the brain and spinal cord.
Then came the development of machines and robots. These mechanical devices also
had to touch and feel their environment, so researchers began to study how this
sensation could be transferred to machines. The era of machine haptics had begun. The
earliest machines that allowed haptic interaction with remote objects were simple
lever-and-cable-actuated tongs placed at the end of a pole. By moving, orienting and
squeezing a pistol grip, a worker could remotely control tongs, which could be used to
grab, move and manipulate an object.
In the 1940s, these relatively crude remote manipulation systems were improved to
serve the nuclear and hazardous material industries. Through a machine interface, workers could manipulate toxic and dangerous substances without risking exposure.
Eventually, scientists developed designs that replaced mechanical connections with
motors and electronic signals. This made it possible to communicate even subtle hand
actions to a remote manipulator more efficiently than ever before.
The next big advance arrived in the form of the electronic computer. At first, computers
were used to control machines in a real environment (think of the computer that
controls a factory robot in an auto assembly plant). But by the 1980s, computers could
generate virtual environments -- 3-D worlds into which users could be cast. In these
early virtual environments, users could receive stimuli through sight and sound only.
Haptic interaction with simulated objects would remain limited for many years.
Then, in 1993, the Artificial Intelligence Laboratory at the Massachusetts Institute of
Technology (MIT) constructed a device that delivered haptic stimulation, finally making
it possible to touch and feel a computer-generated object. The scientists working on the
project began to describe their area of research as computer haptics to differentiate it
from machine and human haptics. Today, computer haptics is defined as the systems
required -- both hardware and software -- to render the touch and feel of virtual
objects. It is a rapidly growing field that is yielding a number of promising haptic
technologies.
Computer-Human Machine Interface
Types Of Haptic Feedback
When we use our hands to explore the world around us, we receive two types of
feedback -- kinesthetic and tactile. To understand the difference between the two,
consider a hand that reaches for, picks up and explores a baseball. As the hand reaches for the ball and adjusts its shape to grasp, a unique set of data points describing joint
angle, muscle length and tension is generated. This information is collected by a
specialized group of receptors embedded in muscles, tendons and joints.
Known as proprioceptors, these receptors carry signals to the brain, where they are
processed by the somatosensory region of the cerebral cortex. The muscle spindle is
one type of proprioceptor that provides information about changes in muscle length.
The Golgi tendon organ is another type of proprioceptor that provides information
about changes in muscle tension. The brain processes this kinesthetic information to
provide a sense of the baseball's gross size and shape, as well as its position relative to the hand, arm and body.
When the fingers touch the ball, contact is made between the finger pads and the ball
surface. Each finger pad is a complex sensory structure containing receptors both in the
skin and in the underlying tissue. There are many types of these receptors, one for each
type of stimulus: light touch, heavy touch, pressure, vibration and pain. The data coming
collectively from these receptors helps the brain understand subtle tactile details about
the ball. As the fingers explore, they sense the smoother texture of the leather, the
raised coarseness of the laces and the hardness of the ball as force is applied. Even the
thermal properties of the ball are sensed through tactile receptors.
Force feedback is a term often used to describe tactile and/or kinesthetic feedback. As
our baseball example illustrates, force feedback is vastly complex. Yet, if a person is to
feel a virtual object with any fidelity, force feedback is exactly the kind of information
the person must receive. Computer scientists began working on devices -- haptic
interface devices -- that would allow users to feel virtual objects via force feedback.
Early attempts were not successful. But as we'll see in the next section, a new
generation of haptic interface devices is delivering an unsurpassed level of performance,
fidelity and ease of use.
Haptic Devices

Haptic devices (or haptic interfaces) are mechanical devices that mediate
communication between the user and the computer. Haptic devices allow users to
touch, feel and manipulate three-dimensional objects in virtual environments and tele-
operated systems. Most common computer interface devices, such as basic mice and
joysticks, are input-only devices, meaning that they track a user's physical manipulations
but provide no manual feedback. As a result, information flows in only one direction,
from the peripheral to the computer. Haptic devices are input-output devices, meaning
that they track a user's physical manipulations (input) and provide realistic touch
sensations coordinated with on-screen events (output). Examples of haptic devices
include consumer peripheral devices equipped with special motors and sensors (e.g.,
force feedback joysticks and steering wheels) and more sophisticated devices designed
for industrial, medical or scientific applications (e.g., PHANTOM device).
Haptic interfaces are relatively sophisticated devices. As a user manipulates the end
effector, grip or handle on a haptic device, encoder output is transmitted to an interface
controller at very high rates. Here the information is processed to determine the
position of the end effector. The position is then sent to the host computer running a
supporting software application. If the supporting software determines that a reaction
force is required, the host computer sends feedback forces to the device. Actuators
(motors within the device) apply these forces based on mathematical models that simulate the desired sensations. For example, when simulating the feel of a rigid wall
with a force feedback joystick, motors within the joystick apply forces that simulate the
feel of encountering the wall. As the user moves the joystick to penetrate the wall, the
motors apply a force that resists the penetration. The farther the user penetrates the
wall, the harder the motors push back to force the joystick back to the wall surface. The
end result is a sensation that feels like a physical encounter with an obstacle.
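The spring-like wall behavior described above can be sketched as a simple penalty-force model. The stiffness value and the wall placement (surface at x = 0, free space at x > 0) are illustrative assumptions, not parameters of any particular device:

```python
# Minimal sketch of penalty-based force feedback for a rigid virtual wall.
# Assumed setup (not from the source): wall surface at x = 0, free space at
# x > 0, and a linear spring model F = k * penetration depth.

WALL_STIFFNESS = 800.0  # N/m; illustrative value, real devices tune this carefully

def wall_force(x_m: float) -> float:
    """Return the restoring force (N) the motors should apply along x.

    Zero force in free space; the deeper the user pushes past the wall
    surface, the harder the device pushes back toward the surface.
    """
    penetration = -x_m               # how far past the wall (x < 0) we are
    if penetration <= 0.0:
        return 0.0                   # not touching the wall yet
    return WALL_STIFFNESS * penetration  # push back out, proportional to depth

# The farther the joystick penetrates, the larger the opposing force:
print(wall_force(0.01))    # in free space -> 0.0
print(wall_force(-0.005))  # 5 mm into the wall -> 4.0 N
print(wall_force(-0.010))  # 10 mm into the wall -> 8.0 N
```

Real haptic controllers add damping and friction terms on top of this spring term, but the proportional push-back is the core of the "encountering a wall" sensation the text describes.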
The human sensorial characteristics impose much faster refresh rates for haptic
feedback than for visual feedback. Computer graphics has for many years contented
itself with low scene refresh rates of 20 to 30 frames/sec. In contrast, tactile sensors in
the skin respond best to vibrations higher than 300 Hz. This order-of-magnitude
difference between haptics and vision bandwidths requires that the haptic interface
incorporate a dedicated controller. Because it is computationally expensive to convert
encoder data to end effector position and translate motor torques into directional
forces, a haptic device will usually have its own dedicated processor. This removes
computation costs associated with haptics, and the host computer can dedicate its processing power to application requirements, such as rendering high-level graphics.
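The rate mismatch that motivates a dedicated haptic controller can be illustrated with a toy loop. The tick counts below are only an order-of-magnitude sketch; a real device runs its force loop on dedicated hardware, not in a software loop like this:

```python
# Sketch of the rate mismatch between haptic and visual feedback: the force
# loop must update on the order of 1000 times per second, while the graphics
# loop redraws only 20-30 times per second. Rates are illustrative.

HAPTIC_HZ = 1000
GRAPHICS_HZ = 30
TICKS_PER_FRAME = HAPTIC_HZ // GRAPHICS_HZ  # haptic updates per redraw

def run(seconds: int):
    force_updates = 0
    frames = 0
    for tick in range(seconds * HAPTIC_HZ):
        force_updates += 1                  # compute and output force every tick
        if tick % TICKS_PER_FRAME == 0:
            frames += 1                     # redraw the scene far less often
    return force_updates, frames

forces, frames = run(1)
print(forces, frames)  # ~1000 force updates for ~31 redraws in one second
```

Offloading the inner `force_updates` work to the device's own processor is exactly what frees the host computer for graphics, as the paragraph above notes.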
General-purpose commercial haptic interfaces used today can be classified as either
ground-based devices (force reflecting joysticks and linkage-based devices) or body-
based devices (gloves, suits, exoskeletal devices). The most popular design on the
market is a linkage-based system, which consists of a robotic arm attached to a pen (see
Figure 1). The arm tracks the position of the pen and is capable of exerting a force on
the tip of this pen. To meet the haptic demands required to fool one's sense of touch,
sophisticated hardware and software are required to determine the proper joint angles
and torques necessary to exert a single point of force on the tip of the pen. Not only is it
difficult to control force output because of the update demand, but the mass of a robotic
arm also introduces inertial forces that must be accounted for. As a result, the cost of a linkage-
based force feedback device ranges from $10,000 to over $100,000.
Figure 1. The PHANTOM Desktop from SensAble Technologies is a popular linkage-based haptic device. The position
and orientation of the pen are tracked through encoders in the robotic arm. Three degrees of force, in the x, y and
z directions, are achieved through motors that apply torques at each joint in the robotic arm.
An alternative to a linkage-based device is one that is tension-based. Instead of applying
force through links, cables are connected to the point of contact in order to exert a
force. Encoders determine the length of each cable. From this information, the position
of a grip can be determined. Motors are used to create tension in the cables, which
results in an applied force at the grip.
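The cable-length-to-position step can be sketched as a small trilateration. The anchor layout here (one cable attached at the origin and one at unit distance along each axis, in meters) is a hypothetical calibration chosen to keep the algebra short; a real device would use its measured anchor locations:

```python
# Sketch of recovering grip position from measured cable lengths in a
# tension-based device. Anchor positions are hypothetical, not from the text.
import math

def grip_position(d0, dx, dy, dz):
    """Trilaterate the grip from four cable lengths.

    d0: cable from anchor (0,0,0); dx/dy/dz: cables from anchors
    (1,0,0), (0,1,0), (0,0,1). Subtracting the sphere equations
    |p - a|^2 = d^2 pairwise leaves equations linear in p.
    """
    x = (1 - dx**2 + d0**2) / 2
    y = (1 - dy**2 + d0**2) / 2
    z = (1 - dz**2 + d0**2) / 2
    return (x, y, z)

# Place the grip at (0.3, 0.2, 0.1), compute the exact cable lengths,
# and check that the position round-trips through the solver:
p = (0.3, 0.2, 0.1)
anchors = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
lengths = [math.dist(p, a) for a in anchors]
print(grip_position(*lengths))  # ~ (0.3, 0.2, 0.1)
```

With the position known, the controller can compute the cable tensions needed to produce a desired net force at the grip, mirroring the encoder-to-force pipeline of the linkage devices described earlier.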
A haptic device must be able to resist the force applied by a user. Just as a lever allows
you to apply a large force with little effort, longer links in a robotic arm allow the user
to exert larger torques on the motors that control the bending of the robotic arm. Larger
linkage-based systems therefore require larger motors that can exert more force. This
can become very expensive and in most cases impractical, which puts a limit on the
size of a linkage-based system. Because a tension-based device applies force directly
to the point of contact through the tension in the cables, there is no need to
compensate for the lever effect. This removes the limitation on workspace size and
the amount of force that can be applied. As a result, a tension-based device can be
the size of a mouse pad or the size of a room.
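The lever effect can be made concrete with a few illustrative numbers (none of these figures describe a real device):

```python
# Illustrative numbers for the lever effect: to resist the same user force
# at the end effector, a motor at the base of a longer link must supply
# proportionally more torque (torque = force x link length). A cable, by
# contrast, pulls directly on the grip, so its motor only needs to hold the
# same tension regardless of workspace size.

USER_FORCE_N = 5.0  # force the user applies at the tip (made-up example)

for link_length_m in (0.1, 0.3, 0.6):
    torque = USER_FORCE_N * link_length_m  # N*m needed at the joint motor
    print(f"{link_length_m:.1f} m link -> {torque:.2f} N*m motor torque")
```

Doubling a linkage arm's reach roughly doubles the motor torque (and cost) needed, which is why workspace size is capped for linkage designs but not for tension-based ones.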
Figure 3. Various tension-based force feedback devices developed at the Precision and Intelligence Laboratory at
the Tokyo Institute of Technology. (a) A six-degree-of-freedom force feedback device with a 54x54x54 cm workspace.
(b) A twenty-four-degree-of-freedom device with a 100x50x50 cm workspace. (c) A five-degree-of-freedom device
with a 2x2x2 m workspace, used within a CAVE system.
Unlike links, the cables used in a tension-based system have little mass. This
essentially eliminates inertial effects and increases the accuracy of the applied force.
Cables are also much safer than links in that they have little mass that can collide with
a user. Cables are also unobtrusive. They do not obstruct one's view, as would one or
more robotic arms. Stereoscopic 3-D projection can allow virtual objects to be placed
within the frame of a tension-based device so that visualization can be co-located with
the sense of touch.
Tension-based devices are only now beginning to appear in commercial markets. The
low cost of these systems makes them especially appealing compared to linkage-based
devices. Surgery simulation is the arena where these devices are first making an
appearance, with laparoscopy, trans-urethral resection of the prostate, and suturing
simulation serving as some of the initial applications.
Common interface devices like the mouse and joystick are input-only devices; they
provide no feedback. Haptic devices are input-output devices.
Design
Haptics are enabled by actuators that apply forces to the skin for touch feedback. The
actuator provides mechanical motion in response to an electrical stimulus. Most early
designs of haptic feedback use electromagnetic technologies such as vibratory motors,
like a vibrating alert in a cell phone or a voice coil in a speaker, where a central mass is
moved by an applied magnetic field. These electromagnetic motors typically operate at
resonance and provide strong feedback, but produce a limited range of sensations. Next
generation actuator technologies are beginning to emerge, offering a wider range of
effects due to more rapid response times. Next generation haptic actuator technologies
include electroactive polymers, piezoelectric, and electrostatic surface actuation.
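As a rough sketch of how a resonance-driven vibratory actuator is used, the snippet below generates a short sine burst at an assumed resonant frequency. The 175 Hz resonance and 8 kHz sample rate are illustrative values, not figures from the text:

```python
# Sketch of driving an electromagnetic vibratory actuator at resonance:
# a short sine burst at the (assumed) resonant frequency, faded out so the
# moving mass is not left ringing. Parameter values are illustrative.
import math

RESONANT_HZ = 175.0   # assumed actuator resonance
SAMPLE_HZ = 8000      # assumed drive-signal sample rate

def click_burst(duration_s=0.02, amplitude=1.0):
    """Generate one short 'click' burst as a list of drive samples."""
    n = int(duration_s * SAMPLE_HZ)
    samples = []
    for i in range(n):
        t = i / SAMPLE_HZ
        envelope = 1.0 - i / n          # linear fade-out over the burst
        samples.append(amplitude * envelope
                       * math.sin(2 * math.pi * RESONANT_HZ * t))
    return samples

burst = click_burst()
print(len(burst))                          # 160 samples for a 20 ms burst
print(max(abs(s) for s in burst) <= 1.0)   # True: stays within drive limits
```

Driving at resonance gives strong feedback from a small motor, but, as the paragraph notes, shaping only the envelope of a single resonant tone limits the range of distinct sensations, which is what the newer actuator technologies aim to widen.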
Actuators and Sensors
Haptic System
There are several approaches to creating haptic systems. Although they may look
drastically different, they all have two important things in common -- software to
determine the forces that result when a user's virtual identity interacts with an object
and a device through which those forces can be applied to the user. The actual process
used by the software to perform its calculations is called haptic rendering. A common
rendering method uses polyhedral models to represent objects in the virtual world.
These 3-D models can accurately portray a variety of shapes and can calculate touch
data by evaluating how force lines interact with the various faces of the object. Such 3-D
objects can be made to feel solid and can have surface texture.
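The evaluation of forces against an object's faces can be sketched for a single planar face using a penalty force along the face normal. The stiffness value and function names are illustrative; a full renderer would track the nearest face of the whole polyhedral mesh:

```python
# Sketch of polyhedral haptic rendering for one face: test the user's
# contact point against the face plane and, if it has passed inside, push
# back along the face's outward normal. Values are illustrative.

STIFFNESS = 500.0  # N/m; assumed penalty stiffness

def face_force(point, face_point, face_normal):
    """Force for one planar face with unit outward normal `face_normal`."""
    # Signed distance of the point from the face plane (negative = inside).
    d = sum((p - q) * n for p, q, n in zip(point, face_point, face_normal))
    if d >= 0.0:
        return (0.0, 0.0, 0.0)          # on or outside the surface: no force
    mag = -STIFFNESS * d                # penalty proportional to penetration
    return tuple(mag * n for n in face_normal)

# Face through the origin with outward normal +z; point 2 mm inside:
print(face_force((0.0, 0.0, -0.002), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)))
# -> (0.0, 0.0, 1.0): a 1 N push back out along the normal
```

Surface texture and friction effects are typically layered on top of this normal force by perturbing its direction or adding tangential terms as the contact point moves across the face.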
The job of conveying haptic images to the user falls to the interface device. In many
respects, the interface device is analogous to a mouse, except a mouse is a passive
device that cannot communicate any synthesized haptic data to the user. Let's look at a
few specific haptic systems to understand how these devices work.
Applications of Haptic Technology
It's not difficult to think of ways to apply haptics. Video game makers have been early
adopters of passive haptics, which takes advantage of vibrating joysticks, controllers and
steering wheels to reinforce on-screen activity. But future video games will enable
players to feel and manipulate virtual solids, fluids, tools and avatars. The Novint Falcon
haptics controller is already making this promise a reality. The 3-D force feedback
controller allows you to tell the difference between a pistol report and a shotgun blast,
or to feel the resistance of a longbow's string as you pull back an arrow.
Graphical user interfaces, like those that define Windows and Mac operating
environments, will also benefit greatly from haptic interactions. Imagine being able to
feel graphic buttons and receive force feedback as you depress a button. Some
touchscreen manufacturers are already experimenting with this technology. Nokia
phone designers have perfected a tactile touchscreen that makes on-screen buttons
behave as if they were real buttons. When a user presses the button, he or she feels movement in and movement out, and hears an audible click. Nokia engineers
accomplished this by placing two small piezoelectric sensor pads under the screen and
designing the screen so it could move slightly when pressed. Everything -- movement
and sound -- is synchronized perfectly to simulate real button manipulation.
Although several companies are joining Novint and Nokia in the push to incorporate
haptic interfaces into mainstream products, cost is still an obstacle. The most
sophisticated touch technology is found in industrial, military and medical applications.
Training with haptics is becoming more and more common. For example, medical
students can now perfect delicate surgical techniques on the computer, feeling what it's
like to suture blood vessels in an anastomosis or inject BOTOX into the muscle tissue of
a virtual face. Aircraft mechanics can work with complex parts and service procedures,
touching everything that they see on the computer screen. And soldiers can prepare for
battle in a variety of ways, from learning how to defuse a bomb to operating a
helicopter, tank or fighter jet in virtual combat scenarios.
Haptic technology is also widely used in teleoperation, or telerobotics. In a telerobotic
system, a human operator controls the movements of a robot that is located some
distance away. Some teleoperated robots are limited to very simple tasks, such as
aiming a camera and sending back visual images. In a more sophisticated form of
teleoperation known as telepresence, the human operator has a sense of being located in the robot's environment. Haptics now makes it possible to include touch cues in
addition to audio and visual cues in telepresence models. It won't be long before
astronomers and planetary scientists actually hold and manipulate a Martian rock through
an advanced haptics-enabled telerobot -- a high-touch version of the Mars Exploration
Rover.
Teleoperators and simulators
Teleoperators are remote-controlled robotic tools; when contact forces are reproduced
to the operator, it is called haptic teleoperation. The first electrically actuated
teleoperators were built in the 1950s at the Argonne National Laboratory by Raymond
Goertz to remotely handle radioactive substances. Since then, the use of force feedback
has become more widespread in other kinds of teleoperators, such as remote-controlled
underwater exploration devices.
When such devices are simulated using a computer (as they are in operator training
devices) it is useful to provide the force feedback that would be felt in actual operations.
Since the objects being manipulated do not exist in a physical sense, the forces are
generated using haptic (force generating) operator controls. Data representing touch
sensations may be saved or played back using such haptic technologies. Haptic simulators
are used in medical simulators and flight simulators for pilot training.
Computer and video games
Haptic feedback is commonly used in arcade games, especially racing video games. In
1976, Sega's motorbike game Moto-Cross, also known as Fonz, was the first game to use
haptic feedback: it caused the handlebars to vibrate during a collision with another
vehicle. Tatsumi's TX-1 introduced force feedback to car driving games in 1983.
Simple haptic devices are common in the form of game controllers, joysticks, and steering
wheels. Early implementations were provided through optional components, such as the
Nintendo 64 controller's Rumble Pak. Many newer-generation console controllers and
joysticks feature built-in feedback devices, including Sony's DualShock technology. Some
automobile steering wheel controllers, for example, are programmed to provide a "feel"
of the road. As the user makes a turn or accelerates, the steering wheel responds by
resisting turns or slipping out of control.
In 2007, Novint released the Falcon, the first consumer 3D touch device with high
resolution three-dimensional force feedback; this allowed the haptic simulation of
objects, textures, recoil, momentum, and the physical presence of objects in games.
Personal computers
In 2008, Apple's MacBook and MacBook Pro started incorporating a "Tactile Touchpad"
design with button functionality and haptic feedback incorporated into the tracking
surface. Products such as the Synaptics ClickPad followed thereafter.
Mobile devices
Tactile haptic feedback is becoming common in cellular devices. Handset manufacturers
like LG and Motorola are including different types of haptic technologies in their devices;
in most cases, this takes the form of vibration response to touch. Alpine Electronics uses a
haptic feedback technology named PulseTouch on many of their touch-screen car
navigation and stereo units. The Nexus One features haptic feedback, according to its
specifications.
Virtual reality
Haptics are gaining widespread acceptance as a key part of virtual reality systems, adding the sense of touch to previously visual-only solutions. Most of these solutions use stylus-
based haptic rendering, where the user interfaces to the virtual world via a tool or stylus,
giving a form of interaction that is computationally realistic on today's hardware. Systems
are being developed to use haptic interfaces for 3D modeling and design that are
intended to give artists a virtual experience of real interactive modeling. Researchers from
the University of Tokyo have developed 3D holograms that can be "touched" through
haptic feedback using "acoustic radiation" to create a pressure sensation on a user's
hands (see future section). The researchers, led by Hiroyuki Shinoda, had the technology
on display at SIGGRAPH 2009 in New Orleans.
Research
Research has been done to simulate different kinds of taction by means of high-speed
vibrations or other stimuli. One device of this type uses a pad array of pins, where the pins
vibrate to simulate a surface being touched. While this does not have a realistic feel, it
does provide useful feedback, allowing discrimination between various shapes, textures,
and resiliencies. Several haptics APIs have been developed for research applications, such
as Chai3D, OpenHaptics, and the Open Source H3DAPI.
Medicine
Haptic interfaces for medical simulation may prove especially useful for training in
minimally invasive procedures such as laparoscopy and interventional radiology, as well as
for performing remote surgery. A particular advantage of this type of work is that
surgeons can perform more operations of a similar type with less fatigue. It is well
documented that a surgeon who performs more procedures of a given kind will have
statistically better outcomes for their patients. Haptic interfaces are also used in
rehabilitation robotics.
In ophthalmology, "haptic" refers to supporting springs, two of which hold an artificial lens within the lens capsule after the surgical removal of cataracts.
A Virtual Haptic Back (VHB) was successfully integrated in the curriculum at the Ohio
University College of Osteopathic Medicine. Research indicates that the VHB is a significant teaching aid in palpatory diagnosis (detection of medical problems via touch). The VHB simulates the contour and stiffness of human backs, which are palpated with two haptic interfaces (SensAble Technologies, PHANToM 3.0). Haptics have also been applied in the field of prosthetics and orthotics. Research has been underway to provide essential
feedback from a prosthetic limb to its wearer. Several research projects through the US
Department of Education and National Institutes of Health focused on this area. Recent
work by Edward Colgate, Pravin Chaubey, and Allison Okamura et al. focused on
investigating fundamental issues and determining effectiveness for rehabilitation.
Robotics
The Shadow Hand uses the sense of touch, pressure, and position to reproduce the strength, delicacy, and complexity of the human grip. The SDRH was developed by Richard
Greenhill and his team of engineers in London as part of The Shadow Project, now known
as the Shadow Robot Company, an ongoing research and development program whose
goal is to complete the first convincing artificial humanoid. An early prototype can be seen
in NASA's collection of humanoid robots, or robonauts.[19]
The Shadow Hand has haptic
sensors embedded in every joint and finger pad, which relay information to a central
computer for processing and analysis. Carnegie Mellon University in Pennsylvania and
Bielefeld University in Germany found The Shadow Hand to be an invaluable tool in
advancing the understanding of haptic awareness, and in 2006 they were involved in
related research.

The first PHANToM, which allows one to interact with objects in virtual reality through touch, was developed by Thomas Massie while a student of Ken Salisbury at MIT.
Arts and design
Touch is not limited to passive feeling; it also allows real-time interactivity with virtual objects.
Thus, haptics are used in virtual arts, such as sound synthesis or graphic design and
animation. The haptic device allows the artist to have direct contact with a virtual
instrument that produces real-time sound or images. For instance, the simulation of a
violin string produces real-time vibrations of this string under the pressure and
expressiveness of the bow (haptic device) held by the artist. This can be done with physical modelling synthesis.

Designers and modellers may use high-degree-of-freedom input devices that give touch feedback relating to the "surface" they are sculpting or creating, allowing a faster and more natural workflow than traditional methods. Artists working with haptic technology such as vibrotactile effectors include Christa Sommerer, Laurent Mignonneau, and Stahl Stenslie.
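As a concrete, if simplified, example of physical modelling synthesis, the classic Karplus-Strong algorithm simulates a plucked string with a delay line and a lowpass filter. A bowed string, as in the violin example above, additionally needs a bow-friction model, which is omitted from this sketch:

```python
# Minimal Karplus-Strong plucked-string model: a delay line the length
# of one period is filled with noise (the "pluck"), then repeatedly
# averaged and damped, which filters the noise down to a decaying tone.
import random

def pluck(freq=440, sample_rate=44100, n_samples=2000, damping=0.996):
    period = int(sample_rate / freq)   # delay line length ~ string length
    line = [random.uniform(-1, 1) for _ in range(period)]  # initial pluck
    out = []
    for i in range(n_samples):
        # Averaging adjacent samples is a lowpass filter: energy loss.
        s = damping * 0.5 * (line[i % period] + line[(i + 1) % period])
        line[i % period] = s
        out.append(s)
    return out

samples = pluck()
print(len(samples))                          # 2000
print(max(abs(s) for s in samples) <= 1.0)   # True -- amplitude decays
```

In a haptic-musical setup, parameters such as `damping` could be driven in real time by bow pressure measured at the haptic device, which is the coupling the violin example describes.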
Importance Of Haptics
In video games, the addition of haptic capabilities is a nice-to-have: it increases the realism of the game and, as a result, the user's satisfaction. But in training and other applications, haptic interfaces are vital. That's because the sense of touch conveys rich and detailed information about an object. When it's combined with other senses, especially sight,
touch dramatically increases the amount of information that is sent to the brain for
processing. The increase in information reduces user error, as well as the time it takes to
complete a task. It also reduces the energy consumption and the magnitudes of contact
forces used in a teleoperation situation.
Clearly, Samsung is hoping to capitalize on some of these benefits with the introduction of
the Anycall Haptic phone. Nokia will push the envelope even further when it introduces phones with tactile touchscreens. Yes, such phones will be cool to look at. And, yes, they will be cool to touch. But they will also be easier to use, with the touch-based features leading to fewer input errors and an overall more satisfying experience.
Advantages
Ivan Sutherland, a founding father of virtual reality, suggested that the human kinesthetic sense is yet another independent channel to the brain, one whose information is assimilated quite subconsciously. This and other statements led researchers to develop haptic interfaces. By adding an independent input channel, the
amount of information that is processed by the brain is increased. The increase in
information reduces the error and time taken to complete a task. It also reduces the
energy consumption and the magnitudes of contact forces used in a teleoperation situation.

Humans use their hands in exploring environments that have poor or no visibility.
For instance, divers in murky water use their haptic senses in substitution for their visual
senses with little loss in performance. Humans are very good at identifying three-
dimensional objects placed in their hands, but are not as able to identify two-dimensional
objects. Although not as adept at searching across a two-dimensional space, humans have
particular ways of exploring such spaces. In two-dimensional exploration, such as
exploring raised surfaces on a plane, humans use a set of exploratory procedures as
observed by Lederman, Klatzky and Balakrishnan. Their research describes how humans
gather information about a two-dimensional surface. This usually happens by first identifying an edge and then following a contour.
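The edge-then-contour strategy can be mimicked in code. The toy sketch below is not from the cited research; it simply raster-scans a binary "raised surface" grid until it hits a first edge, then collects the boundary cells of the raised region:

```python
# Toy version of the two-step exploratory procedure: (1) scan until an
# edge is found, (2) gather the contour of the raised region.

def explore(grid):
    """Return (first_edge, contour): the first raised cell met by a
    raster scan, and all raised cells that border empty space."""
    rows, cols = len(grid), len(grid[0])
    raised = [(r, c) for r in range(rows) for c in range(cols) if grid[r][c]]
    if not raised:
        return None, []

    def on_boundary(r, c):
        # A raised cell is on the contour if any 4-neighbour is empty
        # or off the grid.
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < rows and 0 <= nc < cols) or not grid[nr][nc]:
                return True
        return False

    contour = [(r, c) for (r, c) in raised if on_boundary(r, c)]
    return raised[0], contour

shape = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
first, contour = explore(shape)
print(first)         # (1, 1)
print(len(contour))  # 4 -- every cell of this 2x2 block is on the edge
```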
Haptic displays alone are nearly useless, but when they are used in conjunction with a
visual display, they can become more useful than a stereoscopic display or a display with
multiple viewpoints. Batter and Brooks conducted an experiment to test whether haptic interfaces actually affected how well a user could learn from a system. They tested
several sections of a physics class that were learning electrostatic fields. The experimental
part of the class was allowed to use a force feedback device in a laboratory exercise, and the control group was not. The experimental group did better than the control group because of its access to a haptic display in its lab work.
Disadvantages
Despite the progress made in the past two decades, haptic interfaces have not yet become commonplace. One main reason is the technological challenge associated with the design and production of interfaces that make physical contact with human users. Although hardware such as the PHANToM is on the commercial market, it is still very expensive for home or business users to purchase. At the same time, software compatibility is another reason why haptics is not yet very common.
Future
Haptic interfaces will continue to be centered on the hands because people gather
information from their surroundings in a haptic manner with their hands more than any
other body part. The search for an inexpensive, portable and useful haptic display will be
long and difficult, but it will continue for many years to come. Many researchers look for a
'natural' interface, but since there is a physical barrier between human sensory-motor capabilities and the electronic world of the computer, there will not be a truly natural system until direct neural stimulation of the brain becomes possible. Instead, some suggest the search
should be for an intuitive system. Robert Stone emphasized this point by stating:
An intuitive interface between man and machine is one which requires little training and
proffers a working style most like that used by the human being to interact with
environments and objects in his day-to-day life. In other words, the human interacts with
elements of his task by looking, holding, manipulating, speaking, listening, and moving,
using as many of his natural skills as are appropriate, or can reasonably be expected to be
applied to a task. (Stone, 1995)
When creating a haptic interface, it is important to keep in mind what the device is going
to be used for. If it is not going to be used in a way that is intuitive to an operator, it may
cause problems even though the user is trained in its use. In times of stress, excitement,
or fatigue, people forget much of their training and do what comes intuitively. So if a
haptic interface is not being used in an intuitive manner, the operator may misuse the
interface by doing something that is natural to them.

The study of haptics holds a key to
unlocking interface problems with the computer. Haptics enable a fairly intuitive way for
the human user to get information into the computer, and for the computer to display
information from a virtual world. Research in this area can help enable those who have
been unable to use a computer to its fullest extent overcome a physical limitation, and it
can enable users to explore objects and places that have been inaccessible under normal
circumstances.
Touch the virtual
Future applications
Future applications of haptic technology cover a wide spectrum of human interaction with
technology. Current research focuses on the mastery of tactile interaction with holograms and distant objects, which if successful may result in applications and advancements in gaming, movies, manufacturing, medicine, and other industries.

The medical industry stands to gain from virtual and telepresence surgeries, which provide new options for medical
care. The clothing retail industry could gain from haptic technology by allowing users to
"feel" the texture of clothes for sale on the internet. Future advancements in haptic
technology may create new industries that were previously not feasible or realistic.
Holographic interaction
Researchers at the University of Tokyo are working on adding haptic feedback to
holographic projections. The feedback allows the user to interact with a hologram and
receive tactile responses as if the holographic object were real. The research uses
ultrasound waves to create acoustic radiation pressure, which provides tactile feedback as
users interact with the holographic object. The haptic technology does not affect the
hologram, or the interaction with it, only the tactile response that the user perceives. The
researchers posted a video displaying what they call the Airborne Ultrasound Tactile
Display.

As of 2008 the technology was not ready for mass production or mainstream application in industry, but it was quickly progressing, and industrial companies showed a positive response to the technology. This example of a possible future application is the first in which the user does not have to be outfitted with a special glove or use a special control; they can "just walk up and use [it]".
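The physics behind this can be estimated with a one-line formula: a sound beam of intensity I exerts a time-averaged radiation pressure of roughly 2I/c on a perfectly reflecting surface (I/c if fully absorbed), where c is the speed of sound. The numbers below are illustrative, not the Tokyo group's actual figures:

```python
# Back-of-the-envelope acoustic radiation pressure, the effect the
# Airborne Ultrasound Tactile Display exploits.

C_AIR = 343.0  # speed of sound in air, m/s

def radiation_pressure(intensity_w_m2, reflecting=True):
    """Time-averaged radiation pressure (Pa) of a plane sound beam
    hitting a surface: 2*I/c if reflecting, I/c if absorbing."""
    return (2.0 if reflecting else 1.0) * intensity_w_m2 / C_AIR

# An illustrative 100 W/m^2 focused ultrasound spot on the skin:
print(round(radiation_pressure(100.0), 3))  # 0.583 (Pa)
```

The resulting pressures are tiny, which is why the display focuses the ultrasound into a small spot and modulates it, so the skin's vibration-sensitive receptors can pick it up.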
Future medical applications
One currently developing medical innovation is a central workstation used by surgeons to
perform operations remotely. Local nursing staff set up the machine and prepare the
patient, and rather than travel to an operating room, the surgeon becomes a telepresence. This allows expert surgeons to operate from across the country, increasing
availability of expert medical care. Haptic technology provides tactile and resistance
feedback to surgeons as they operate the robotic device. As the surgeon makes an
incision, they feel ligaments as if working directly on the patient.
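A bilateral teleoperation loop of this kind sends the surgeon's hand motion to the remote robot and plays the measured contact force back on the surgeon's device. The sketch below is a bare-bones illustration of one control tick; the function name and gains are invented:

```python
# One tick of a bilateral (two-channel) teleoperation loop:
# forward channel carries motion, backward channel carries force.

def teleop_step(master_pos, contact_force, pos_gain=1.0, force_gain=0.5):
    """Return (robot position command, master feedback force).

    master_pos:    surgeon's device position (m)
    contact_force: force measured at the remote tool (N)
    """
    robot_cmd = pos_gain * master_pos        # robot tracks the surgeon
    feedback = -force_gain * contact_force   # surgeon feels scaled contact
    return robot_cmd, feedback

cmd, fb = teleop_step(master_pos=0.02, contact_force=3.0)
print(cmd, fb)  # 0.02 -1.5
```

In practice such loops run at high rates with stability safeguards; scaling the reflected force down (here by half) is one common way to keep the coupled system stable.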
As of 2003, researchers at Stanford University were developing technology to simulate
surgery for training purposes. Simulated operations allow surgeons and surgical students
to practice and train more. Haptic technology aids in the simulation by creating a realistic
environment of touch. Much like telepresence surgery, surgeons feel simulated ligaments,
or the pressure of a virtual incision as if it were real. The researchers, led by J. Kenneth
Salisbury Jr., professor of computer science and surgery, hope to be able to create realistic internal organs for the simulated surgeries, but Salisbury stated that the task will be difficult.
Conclusion
With the increase in the computational power of office and home PCs, one day haptic mice and interface devices will be common items in the home. Haptic development also helps sense-impaired people experience a new way of operating computers. From the examples mentioned in the commercial applications, we can see that the uses of haptics are practically endless: haptics could be used in medicine, training, e-commerce, or even games. These are just some rough uses; the possibilities are far greater, and right now haptics is helping NASA explore planets in the Solar System by controlling robots. Though the technology is still quite unfamiliar to the general public, once haptics is widely introduced I believe it will grow like the Internet did in the late 1980s and early 1990s. Haptics is the future of online computing and e-commerce: it will enhance the shopping experience and help online shoppers feel merchandise without leaving their homes. Thus I believe the Internet is the way of the future, and haptics will help make it so.
References
http://electronics.howstuffworks.com/everyday-tech/haptic-technology.htm
http://en.wikipedia.org/wiki/Haptic_technology
http://misnt.indstate.edu/harper/Students/Haptics/mis476final.htm
http://powerpointpresentationon.blogspot.in/2010/07/powerpoint-presentation-on-haptics.html
http://www.google.co.in/
Klein, D., D. Rensink, H. Freimuth, G. J. Monkman, S. Egersdörfer, H. Böse & M. Baumann. Modelling the Response of a Tactile Array using Electrorheological Fluids. Journal of Physics D: Applied Physics, vol. 37, no. 5, pp. 794-803, 2004.
Klein, D., H. Freimuth, G. J. Monkman, S. Egersdörfer, A. Meier, H. Böse, M. Baumann, H. Ermert & O. T. Bruhns. Electrorheological Tactile Elements. Mechatronics, vol. 15, no. 7, pp. 883-897. Pergamon, September 2005.
Monkman, G. J. An Electrorheological Tactile Display. Presence (Journal of Teleoperators and Virtual Environments), vol. 1, issue 2, pp. 219-228, MIT Press, July 1992.

Virtual reality is a form of human-computer interaction providing a virtual environment that one can explore through direct interaction with our senses. It is only an imitation of the real world.