

Preliminary Development of PalmSight: Letting the Visually Impaired See using a Hand-Held Device

Alexandra Delazio 1, Zhixuan Yu 2, Samantha Horvath, PhD 3, Jihang Wang 1, John Galeotti, PhD 1,2,3, Roberta Klatzky, PhD 4, George Stetten, PhD 1,2,3

1 Department of Bioengineering, University of Pittsburgh; 2 Department of Bioengineering, Carnegie Mellon University; 3 Robotics Institute, Carnegie Mellon University; 4 Department of Psychology, Carnegie Mellon University

INTRODUCTION

There are approximately 20.6 million adults in the United States reporting significant vision loss [1] who are left to navigate the world reliant on white canes, service dogs, and, most importantly, their sense of touch. If those with visual impairment were given the ability to more accurately identify objects in their surroundings at a distance, they could complete everyday tasks more effectively.

OBJECTIVE: The objective of this research was to develop PalmSight, a device that imitates vision through cameras on the user's palm and provides vibrotactile feedback that directs the user toward an identified object until it is within their grasp (Figures 1A & 1B).

Figure 1. A: Sketch of PalmSight's vibrotactile feedback system; B: Sketch of PalmSight's palm camera.

FINGERSIGHT

The predecessor to PalmSight, known as FingerSight, was a device that allowed blind users to navigate their environment and identify surrounding objects using line detection. FingerSight consisted of a small black-and-white camera (SuperCircuits PC206XP, 510 x 492) and a tactor built into a finger clip worn on the pointer finger; the tactor used a speaker (GUI, Inc. #GC0251K-ND, 1W, 8Ω) driven by low-frequency audio signals to generate vibration (Figure 2). The camera and tactors were programmed using computer imaging software and a series of spatiotemporal edge-detection algorithms [2]. Unlike FingerSight, which detected lines with a finger-mounted camera, PalmSight was initially developed to detect orange-colored objects using a palm camera.

Figure 2. Final FingerSight prototype.

MATERIALS & METHODS

In order to develop a working prototype of PalmSight, device development was split into three main systems: object identification, vibrotactile feedback, and communication between these two systems.

OBJECT IDENTIFICATION SYSTEM

ORANGE COLOR DETECTION:

To allow the palm camera to see the color orange, OpenCV, an open-source computer vision library, was used. Images produced from the camera were split into red, green, and blue (RGB) color channels, and the distance between each pixel's color and a specific RGB value for orange was calculated (Figures 3A & 3B). Binary thresholding and the morphological operations of erosion and dilation were then applied to the distance images to produce a segmented image (Figures 3C & 3D).

Figure 3. A: The image output from the palm camera, depicting an orange screwdriver handle; B: The output distance image, depicting the handle of the screwdriver as darker than the rest of the image; C: Binary thresholded image, depicting the orange screwdriver handle as white with some background noise; D: Morphological operations applied to the final image to remove outliers.

ORANGE OBJECT TRACKING:

Once orange-colored blobs were detectable by the palm camera, the largest orange blob in the image was identified by counting the number of black and white pixels displayed on the image screen. If the number of white pixels exceeded a set threshold, the blob was considered an object of interest. The pixel coordinates of the centroid (center of mass) of the largest orange blob were then calculated and tracked through nine equally sized regions on the screen. The position of the orange object relative to these regions was displayed to the user.

Figure 4. The display screen is split into nine regions: top (T), top-right (T+R), top-left (T+L), right (R), center (star), left (L), bottom (B), bottom-left (B+L), and bottom-right (B+R); here the orange blob is identified in the top (T) region.
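The centroid-to-region mapping can be sketched as follows. This is an illustrative assumption, not the authors' code: the pixel-count threshold is invented, the center region is labeled "C" here (the display marks it with a star), and for simplicity all white pixels are treated as one blob, as if morphological cleanup left a single object.

```python
import numpy as np

# Minimum number of white pixels for a detection to count as an object
# of interest; the actual threshold used by PalmSight is not stated.
MIN_WHITE_PIXELS = 50

# Region labels from Figure 4, laid out as a 3x3 grid.
REGIONS = [["T+L", "T", "T+R"],
           ["L",   "C", "R"],
           ["B+L", "B", "B+R"]]

def locate_object(mask: np.ndarray):
    """Return the region label containing the mask's centroid, or None."""
    ys, xs = np.nonzero(mask)
    if len(xs) < MIN_WHITE_PIXELS:
        return None  # too few white pixels: no object of interest
    # Centroid (center of mass) of all white pixels.
    cx, cy = xs.mean(), ys.mean()
    h, w = mask.shape
    # Divide the screen into nine equally sized regions and index into them.
    col = min(int(cx * 3 // w), 2)
    row = min(int(cy * 3 // h), 2)
    return REGIONS[row][col]
```

The returned label is what the feedback system would translate into a tactor activation.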

VIBROTACTILE FEEDBACK SYSTEM

TACTOR DEVELOPMENT:

ITERATION 1: The tactors used in the precursor to PalmSight consisted of small speakers (GUI, Inc. #GC0251K-ND, 1W, 8Ω), a piece of hard rubber, a grommet, and an eraser tip cemented to the center of the speaker, which directly transmitted the speakers' vibrations onto the user's finger (Figure 5).

Figure 5. Picture and SolidWorks model of tactors used in FingerSight.

ITERATION 2: The tactors for PalmSight followed the same design as those for FingerSight but used different speakers (GUI, Inc. #6C0301K, 8Ω, 0.5W); heat shrink encompassed each tactor to ensure wire insulation and tight junctions between tactor components. These tactors were driven by 210 Hz sinusoidal waves gated by 3.0 Hz square waves (Figure 6).

Though these tactors provided excellent vibrotactile feedback, when all five tactors were placed in formation on the hand their vibrations were indistinguishable due to their size and closeness.

Figure 6. Initial PalmSight tactors.
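The drive signal described above, a 210 Hz sine carrier switched on and off by a 3.0 Hz square wave, can be sketched numerically; the sample rate here is an arbitrary assumption.

```python
import numpy as np

SAMPLE_RATE = 8000  # Hz (assumed; any audio-rate value would do)

def tactor_signal(duration_s: float) -> np.ndarray:
    """Generate a 210 Hz sine carrier gated by a 3.0 Hz square wave."""
    t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * 210.0 * t)    # 210 Hz sinusoidal carrier
    gate = np.sin(2 * np.pi * 3.0 * t) > 0     # 3.0 Hz on/off square gate
    return carrier * gate
```

Gating the carrier at 3 Hz produces a pulsing buzz rather than a continuous tone, which is generally easier to perceive through the skin.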

ITERATION 3: Tactors with smaller grommets and speakers (GUI, Inc. #GC0251KX, 8Ω, 1W) were created to allow for better tactor placement on the hand; however, they did not produce a distinct, noticeable vibration without a larger grommet.

Figure 7. Comparison between small and large speakers and tactors.

ITERATION 4: The speaker tactor design was replaced by Adafruit Vibrating Mini Motor Discs (ADA1201) driven by a 5V input gated by a 3.0 Hz square wave. These new tactors, when stuck on the hand using removable adhesive dots, provided distinguishable, isolated vibrotactile feedback to the user.


Figure 8. Final PalmSight Tactors.

TACTOR ORIENTATION AND PLACEMENT:

To determine the best tactor positioning on the back of the hand, four different combinations of four and five tactors (Figure 8) will be placed on subjects' hands. Subjects will be asked to distinguish between the four or five tactors, given distinct vibrations delivered to individual tactors. Depending on the accuracy of the subjects' tactor identification, a tactor number and position will be chosen.

Figure 8. A: Four-tactor position guide; B: Five-tactor position guide; C: Tactor position guide placed on the hand prior to testing, with black dots marking the positions for tactor placement; D: Each tactor was labeled with an arbitrary letter and corresponding key; subjects are asked to press the key corresponding to the tactor vibration that they feel.

COMMUNICATION BETWEEN SYSTEMS

A Wixel microprocessor provided the interface between the vibrotactile system and the computer vision software running on a Dell computer (Latitude E5450, 2.2 GHz), allowing the vibrating tactors to correspond to the location of the orange object on the display. The final set-up, shown in Figure 9, allowed the vibrotactile feedback to be driven by a gated 3.0 Hz square-wave oscillator or by gated manual buttons for debugging and testing purposes.

Figure 9. Final schematic and circuitry for the PalmSight prototype.
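The PC-to-Wixel link might look like the following sketch. The one-byte protocol, tactor bit assignments, serial port name, and baud rate are all hypothetical illustrations; the text does not describe PalmSight's actual protocol.

```python
# Hypothetical tactor bit assignments: bit 0 = up, bit 1 = right,
# bit 2 = down, bit 3 = left. Diagonal regions activate two tactors.
REGION_TO_BITS = {
    "T": 0b0001, "T+R": 0b0011, "R": 0b0010, "B+R": 0b0110,
    "B": 0b0100, "B+L": 0b1100, "L": 0b1000, "T+L": 0b1001,
    "C": 0b0000,  # object centered: no directional cue needed
}

def command_byte(region: str) -> bytes:
    """Encode a detected region as a single command byte for the Wixel."""
    return bytes([REGION_TO_BITS[region]])

# Sending it would look like this (requires pyserial and a connected Wixel;
# the port name is an assumption):
#   import serial
#   with serial.Serial("/dev/ttyACM0", 115200) as port:
#       port.write(command_byte("T+R"))
```

Encoding each tactor as one bit keeps the command to a single byte, which the Wixel firmware could decode directly into on/off states for its output pins.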

Though initially a ?? camera (Figure 1B) was used to test the PalmSight system, the newest device prototypes use two 180-degree fisheye-lens 1080p wide-angle USB web cameras (ELP) mounted on the palm of the user's hand using a specially designed acrylic and Velcro hand mount (Figure 10). These two cameras were chosen to acquire stereo vision, allowing the user to experience depth perception as well as a wider scope for scanning the surrounding environment.

Figure 10. Palm cameras.

RESULTS & DISCUSSION

To date, the orange identification system has been fully developed and identifies orange objects in the surroundings, outputting a binary image with minimal background noise. The program can determine which of the nine regions of the screen contains the orange object and use this location to activate the corresponding tactor(s) on the back of the hand.

CONCLUSION

The new PalmSight device shows promise in assisting visually impaired individuals to find and grasp objects. After further development of the orange identification and vibrotactile feedback systems, we will test our initial hand-held prototype on non-visually-impaired, blindfolded volunteers to verify its effectiveness and usability.

ACKNOWLEDGMENTS

NIH 1R01EY021641, NSF GRFP 0946825, and Research to Prevent Blindness.

REFERENCES

[1] D.L. Blackwell, J.W. Lucas, and T.C. Clarke. Summary health statistics for U.S. adults: National Health Interview Survey, 2012. National Center for Health Statistics. Vital Health Stat 10(260), 2014.
[2] S. Horvath, J. Galeotti, B. Wu, R. Klatzky, M. Siegel, and G. Stetten. FingerSight: Fingertip Haptic Sensing of the Visual Environment. IEEE Journal of Translational Engineering in Health and Medicine, March 2014.