Learning Robotics Using Python
Table of Contents

Learning Robotics Using Python
Credits
About the Author
About the Reviewers
www.PacktPub.com
  Support files, eBooks, discount offers, and more
  Why subscribe?
  Free access for Packt account holders
Preface
  What this book covers
  What you need for this book
  Who this book is for
  Conventions
  Reader feedback
  Customer support
    Downloading the example code
    Downloading the color images of this book
    Errata
    Piracy
    Questions
1. Introduction to Robotics
  What is a robot?
    History of the term robot
    Modern definition of a robot
  Where do robots come from?
  What can we find in a robot?
    The physical body
    Sensors
    Effectors
    Controllers
  How do we build a robot?
    Reactive control
    Hierarchical (deliberative) control
    Hybrid control
  Summary
2. Mechanical Design of a Service Robot
  The Requirements of a service robot
  Robot drive mechanism
    Selection of motors and wheels
    Calculation of RPM of motors
    Calculation of motor torque
  The design summary
  Robot chassis design
  Installing LibreCAD, Blender, and MeshLab
    Installing LibreCAD
    Installing Blender
    Installing MeshLab
  Creating a 2D CAD drawing of the robot using LibreCAD
    The base plate design
    Base plate pole design
    Wheel, motor, and motor clamp design
    Caster wheel design
    Middle plate design
    Top plate design
  Working with a 3D model of the robot using Blender
    Python scripting in Blender
    Introduction to Blender Python APIs
    Python script of the robot model
  Questions
  Summary
3. Working with Robot Simulation Using ROS and Gazebo
  Understanding robotic simulation
  Mathematical modeling of the robot
    Introduction to the differential steering system and robot kinematics
    Explaining of the forward kinematics equation
    Inverse kinematics
  Introduction to ROS and Gazebo
    ROS Concepts
      The ROS filesystem
      The ROS Computation Graph
      The ROS community level
    Installing ROS Indigo on Ubuntu 14.04.2
      Introducing catkin
      Creating an ROS package
      Hello_world_publisher.py
      Hello_world_subscriber.py
    Introducing Gazebo
      Installing Gazebo
      Testing Gazebo with the ROS interface
    Installing TurtleBot Robot packages on ROS Indigo
      Installing TurtleBot ROS packages using the apt package manager in Ubuntu
    Simulating TurtleBot using Gazebo and ROS
      Creating the Gazebo model from TurtleBot packages
      What is a robot model, URDF, xacro, and robot state publisher?
      Creating a ChefBot description ROS package
        chefbot_base_gazebo.urdf.xacro
        kinect.urdf.xacro
        chefbot_base.urdf.xacro
  Simulating ChefBot and TurtleBot in a hotel environment
  Questions
  Summary
4. Designing ChefBot Hardware
  Specifications of the ChefBot hardware
  Block diagram of the robot
    Motor and encoder
      Selecting motors, encoders, and wheels for the robot
    Motor driver
      Selecting a motor driver/controller
        Input pins
        Output pins
        Power supply pins
    Embedded controller board
    Ultrasonic sensors
      Selecting the ultrasonic sensor
    Inertial Measurement Unit
    Kinect
    Central Processing Unit
    Speakers/mic
    Power supply/battery
  Working of the ChefBot hardware
  Questions
  Summary
5. Working with Robotic Actuators and Wheel Encoders
  Interfacing DC geared motor with Tiva C LaunchPad
    Differential wheeled robot
    Installing the Energia IDE
    Interfacing code
  Interfacing quadrature encoder with Tiva C Launchpad
    Processing encoder data
    Quadrature encoder interfacing code
  Working with Dynamixel actuators
  Questions
  Summary
6. Working with Robotic Sensors
  Working with ultrasonic distance sensors
    Interfacing HC-SR04 to Tiva C LaunchPad
      Working of HC-SR04
      Interfacing code of Tiva C LaunchPad
      Interfacing Tiva C LaunchPad with Python
  Working with the IR proximity sensor
  Working with Inertial Measurement Unit
    Inertial Navigation
    Interfacing MPU 6050 with Tiva C LaunchPad
      Setting up the MPU 6050 library in Energia
      Interfacing code of Energia
    Interfacing MPU 6050 to Launchpad with the DMP support using Energia
  Questions
  Summary
7. Programming Vision Sensors Using Python and ROS
  List of robotic vision sensors and image processing libraries
  Introduction to OpenCV, OpenNI, and PCL
    What is OpenCV?
      Installation of OpenCV from source code in Ubuntu 14.04.2
      Reading and displaying an image using the Python-OpenCV interface
      Capturing from web camera
    What is OpenNI
      Installing OpenNI in Ubuntu 14.04.2
    What is PCL?
  Programming Kinect with Python using ROS, OpenCV, and OpenNI
    How to launch OpenNI driver
    The ROS interface of OpenCV
      Creating ROS package with OpenCV support
      Displaying Kinect images using Python, ROS, and cv_bridge
  Working with Point Clouds using Kinect, ROS, OpenNI, and PCL
    Opening device and Point Cloud generation
    Conversion of Point Cloud to laser scan data
  Working with SLAM using ROS and Kinect
  Questions
  Summary
8. Working with Speech Recognition and Synthesis Using Python and ROS
  Understanding speech recognition
    Block diagram of a speech recognition system
    Speech recognition libraries
      CMU Sphinx/Pocket Sphinx
      Julius
      Windows Speech SDK
  Speech synthesis
    Speech synthesis libraries
      eSpeak
      Festival
  Working with speech recognition and synthesis in Ubuntu 14.04.2 using Python
    Setting up Pocket Sphinx and its Python binding in Ubuntu 14.04.2
    Working with Pocket Sphinx Python binding in Ubuntu 14.04.2
      Output
    Real-time speech recognition using Pocket Sphinx, GStreamer, and Python in Ubuntu 14.04.2
    Speech recognition using Julius and Python in Ubuntu 14.04.2
      Installation of Julius speech recognizer and Python module
      Python-Julius client code
    Improving speech recognition accuracy in Pocket Sphinx and Julius
    Setting up eSpeak and Festival in Ubuntu 14.04.2
  Working with speech recognition and synthesis in Windows using Python
    Installation of the Speech SDK
  Working with Speech recognition in ROS Indigo and Python
    Installation of the pocketsphinx package in ROS Indigo
  Working with speech synthesis in ROS Indigo and Python
  Questions
  Summary
9. Applying Artificial Intelligence to ChefBot Using Python
  Block diagram of the communication system in ChefBot
  Introduction to AIML
    Introduction to AIML tags
  Introduction to PyAIML
    Installing PyAIML on Ubuntu 14.04.2
    Installing PyAIML from source code
  Working with AIML and Python
    Loading a single AIML file from the command-line argument
  Working with A.L.I.C.E. AIML files
    Loading AIML files into memory
    Loading AIML files and saving them in brain files
    Loading AIML and brain files using the Bootstrap method
  Integrating PyAIML into ROS
    aiml_server.py
    aiml_client.py
    aiml_tts_client.py
    aiml_speech_recog_client.py
    start_chat.launch
    start_tts_chat.launch
    start_speech_chat.launch
  Questions
  Summary
10. Integration of ChefBot Hardware and Interfacing it into ROS, Using Python
  Building ChefBot hardware
  Configuring ChefBot PC and setting ChefBot ROS packages
  Interfacing ChefBot sensors with Tiva C LaunchPad
    Embedded code for ChefBot
  Writing a ROS Python driver for ChefBot
  Understanding ChefBot ROS launch files
  Working with ChefBot Python nodes and launch files
  Working with SLAM on ROS to build the map of the room
  Working with ROS localization and navigation
  Questions
  Summary
11. Designing a GUI for a Robot Using Qt and Python
  Installing Qt on Ubuntu 14.04.2 LTS
  Working with Python bindings of Qt
    PyQt
      Installing PyQt on Ubuntu 14.04.2 LTS
    PySide
      Installing PySide on Ubuntu 14.04.2 LTS
  Working with PyQt and PySide
    Introducing Qt Designer
    Qt signals and slots
    Converting a UI file into Python code
    Adding a slot definition to PyQt code
    Up and running of Hello World GUI application
  Working with ChefBot's control GUI
    Installing and working with rqt in Ubuntu 14.04.2 LTS
  Questions
  Summary
12. The Calibration and Testing of ChefBot
  The Calibration of Xbox Kinect using ROS
    Calibrating the Kinect RGB camera
    Calibrating the Kinect IR camera
  Wheel odometry calibration
    Error analysis of wheel odometry
    Error correction
  Calibrating the MPU 6050
  Testing of the robot using GUI
  Pros and cons of the ROS navigation
  Questions
  Summary
Index
Learning Robotics Using Python
Learning Robotics Using Python

Copyright © 2015 Packt Publishing

All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.

Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing, and its dealers and distributors will be held liable for any damages caused or alleged to be caused directly or indirectly by this book.

Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.

First published: May 2015

Production reference: 1220515

Published by Packt Publishing Ltd.
Livery Place
35 Livery Street
Birmingham B3 2PB, UK.

ISBN 978-1-78328-753-6

www.packtpub.com

Cover image by Jarek Blaminsky (<[email protected]>)
Credits

Author
Lentin Joseph

Reviewers
Avkash Chauhan
Vladimir Iakovlev
Blagoj Petrushev
Marek Suppa

Commissioning Editor
Rebecca Youé

Acquisition Editor
Rebecca Youé

Content Development Editor
Athira Laji

Technical Editors
Ankur Ghiye
Manali Gonsalves

Copy Editors
Pranjali Chury
Relin Hedly
Merilyn Pereira
Adithi Shetty

Project Coordinator
Harshal Ved
Proofreaders
Stephen Copestake
Safis Editing

Indexer
Priya Sane

Graphics
Sheetal Aute

Production Coordinator
Nitesh Thakur

Cover Work
Nitesh Thakur
About the Author

Lentin Joseph is an electronics engineer, robotics enthusiast, machine vision expert, embedded programmer, and the founder and CEO of Qbotics Labs (http://www.qboticslabs.com) in India. He earned his bachelor's degree in electronics and communication engineering from the Federal Institute of Science and Technology (FISAT), Kerala. For his final-year engineering project, he created a social robot that can interact with people. The project was a huge success and was mentioned in visual and print media. The main feature of this robot was that it could communicate with people and reply intelligently. It also has some image-processing capabilities, such as face, motion, and color detection. The entire project was implemented using the Python programming language. His interest in robotics, image processing, and Python began with this project.

After graduation, he worked for 3 years at a start-up company focusing on robotics and image processing. In the meantime, he learned famous robotic software platforms, such as Robot Operating System (ROS), V-REP, and Actin (a robotic simulation tool), and image processing libraries, such as OpenCV, OpenNI, and PCL. He also knows about robot 3D designing and embedded programming on Arduino and Stellaris Launchpad.

After 3 years of work experience, he started a new company called Qbotics Labs, which is mainly focused on research into building great products in domains such as wearable technology, robotics, machine vision, green technology, and online education. He maintains a personal website (http://www.lentinjoseph.com) and a technology blog called technolabsz (http://www.technolabsz.com). He publishes his works on his tech blog. He was a speaker at PyCon2013 India, where he spoke on the topic of learning robotics using Python.

I would like to dedicate this book to my parents because they gave me the inspiration to write it. I would also like to convey my regards to my friends who helped and inspired me to write this book.

I would like to thank Marek Suppa for his valuable contribution in writing Chapter 1, Introduction to Robotics, in addition to reviewing this book.
About the Reviewers

Avkash Chauhan is currently leading a team of engineers at a start-up based in San Francisco, where his team is building a big data monitoring platform using machine learning and new-age methods to improve business continuity and gain maximum advantage from the platform itself. He is the founder and principal of Big Data Perspective, with a vision to make the Hadoop platform accessible to mainstream enterprises by simplifying its adoption, customization, management, and support. Before Big Data Perspective, he worked at Platfora Inc., building big data analytics software running natively on Hadoop. Previously, he worked for 8 years at Microsoft, building cloud and big data products and providing assistance to enterprise partners worldwide. Avkash has over 15 years of software development experience in cloud and big data disciplines. He is a programmer at heart in the full-stack discipline and has the business acumen to work with enterprises, meeting their needs. He is passionate about technology and enjoys sharing his knowledge with others through various social media. He has also written a few books on the big data discipline and is very active in the tech social space. He is an accomplished author, blogger, and technical speaker, and he loves the outdoors.

Vladimir Iakovlev is a software developer. Most of the time, he develops web applications using Python, Clojure, and JavaScript. He's the owner of a few semi-popular open source projects. He was a speaker at a few Python-related conferences.

In his free time, Vladimir likes to play with electronic devices, such as Arduino and PyBoard, and image-processing devices, such as Leap Motion. He has tried to build some robots. He has already built a robotic arm.

Currently, Vladimir works at Upwork, where he develops web applications, mostly with Python.

Blagoj Petrushev is a software engineer and consultant based in Skopje, Macedonia. His work revolves mainly around backends, data stores, and network applications. Among his interests are machine learning, NLP, data analysis, modeling and databases, and distributed programming.

Marek Suppa has been playing with (kind of) smart machines for the past few years, which are pretentiously called robots in some parts of the world. Right now, he leads a robotic football team, building tools to help others start with robots, and is setting off on a new venture to see how far the current technology will let us move toward the goal of creating a robot as it was first defined.

I would like to thank everyone who supported the creation of this book, whoever and wherever they might be.
www.PacktPub.com

Support files, eBooks, discount offers, and more

For support files and downloads related to your book, please visit www.PacktPub.com.

Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.PacktPub.com and, as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at <[email protected]> for more details.

At www.PacktPub.com, you can also read a collection of free technical articles, sign up for a range of free newsletters, and receive exclusive discounts and offers on Packt books and eBooks.

https://www2.packtpub.com/books/subscription/packtlib

Do you need instant solutions to your IT questions? PacktLib is Packt's online digital book library. Here, you can search, access, and read Packt's entire library of books.

Why subscribe?

Fully searchable across every book published by Packt
Copy and paste, print, and bookmark content
On demand and accessible via a web browser

Free access for Packt account holders

If you have an account with Packt at www.PacktPub.com, you can use this to access PacktLib today and view 9 entirely free books. Simply use your login credentials for immediate access.
Preface

Learning Robotics with Python contains twelve chapters that explain how to build an autonomous mobile robot from scratch and how to program it using Python. The robot described in this book is a service robot, which can be used to serve food at homes, hotels, and restaurants. From beginning to end, this book discusses the step-by-step procedure of building this robot. The book starts with the basic concepts of robotics and then moves on to the 3D modeling and simulation of the robot. After the successful simulation of the robot, it discusses the hardware components required to build the robot prototype and to complete the robot navigation.

The software part of this robot is mainly implemented using the Python programming language and software frameworks such as Robot Operating System (ROS), OpenCV, and so on. You will understand the application of Python in everything from designing the robot to building the robot's user interface. The Gazebo simulator is used to simulate the robot, and machine vision libraries, such as OpenCV, OpenNI, and PCL, are used to process the 2D and 3D image data of the robot. Each chapter is presented with adequate theory to understand the application aspect. The book was reviewed by experts in this field who are passionate about robotics.
What this book covers

Chapter 1, Introduction to Robotics, contains the basic concepts and terminology of robotics. This chapter is a must for beginners who are just starting with robotics.

Chapter 2, Mechanical Design of a Service Robot, discusses the 2D and 3D CAD design aspect of the robot using LibreCAD and Blender (free software). This chapter also demonstrates how to use the Blender Python APIs in order to build the 3D model.

Chapter 3, Working with Robot Simulation Using ROS and Gazebo, takes you through the simulation of the service robot using Gazebo and ROS.

Chapter 4, Designing ChefBot Hardware, explains the hardware design of the robot, including the block diagram and the hardware components required to build ChefBot.

Chapter 5, Working with Robotic Actuators and Wheel Encoders, covers the interfacing of robotic actuators and wheel encoders using Tiva C LaunchPad. It also mentions high-end smart actuators such as Dynamixel.

Chapter 6, Working with Robotic Sensors, discusses the interfacing of ultrasonic distance sensors, IR proximity sensors, and an IMU using Tiva C LaunchPad.

Chapter 7, Programming Vision Sensors Using Python and ROS, gives an introduction to the OpenCV, OpenNI, and PCL libraries, interfacing them to ROS, and programming them using Python.

Chapter 8, Working with Speech Recognition and Synthesis Using Python and ROS, discusses speech recognition and synthesis using various libraries and interfacing them to ROS programming using Python.

Chapter 9, Applying Artificial Intelligence to ChefBot Using Python, covers tutorials to build a ChatterBot. This can be used to make the robot interactive.

Chapter 10, Integration of ChefBot Hardware and Interfacing it into ROS, Using Python, explores tutorials to integrate the complete hardware and the essential software. It mainly discusses the autonomous navigation of the service robot and how to program it using ROS and Python.

Chapter 11, Designing a GUI for a Robot Using Qt and Python, covers tutorials on how to build a GUI for the user who operates the robot in a typical restaurant. The GUI is built using Qt and the PyQt Python wrapper.

Chapter 12, The Calibration and Testing of ChefBot, explores tutorials on how to calibrate and test the robot for the final run.
What you need for this book

The book is all about how to build a robot. To start with this book, you should have some hardware. The robot can be built from scratch, or you can buy a differential-drive configuration robot with encoder feedback. You should buy a controller board, such as a Texas Instruments LaunchPad, for embedded processing. You should have at least a laptop/netbook for the entire robot process. In this book, we will use an Intel NUC for robot processing. It's very compact in size and delivers high performance. For the 3D vision, you should have a 3D sensor, such as a laser scanner, Kinect, or Asus Xtion Pro.

In the software section, you should have a good understanding of how to work with GNU/Linux commands. You should also have a good knowledge of Python. You should install Ubuntu 14.04.2 LTS to work with the examples. If you have knowledge of ROS, OpenCV, OpenNI, and PCL, it will be a great add-on. You have to install ROS Indigo to test these examples.
Who this book is for

Learning Robotics with Python is a good companion for entrepreneurs who want to explore the service robotics domain, professionals who want to implement more features in their robots, researchers who want to explore more about robotics, and hobbyists or students who want to learn robotics. The book follows a step-by-step guide that can be easily understood by anyone.
Conventions

In this book, you will find a number of styles of text that distinguish between different kinds of information. Here are some examples of these styles, and an explanation of their meaning.

Code words in text, database table names, folder names, filenames, file extensions, pathnames, dummy URLs, user input, and Twitter handles are shown as follows: "The first procedure is to create a world file and save it with the .world file extension."

A block of code is set as follows:

<xacro:include filename="$(find
chefbot_description)/urdf/chefbot_gazebo.urdf.xacro"/>

<xacro:include filename="$(find
chefbot_description)/urdf/chefbot_properties.urdf.xacro"/>

Any command-line input or output is written as follows:

$ roslaunch chefbot_gazebo chefbot_empty_world.launch

New terms and important words are shown in bold. Words that you see on the screen, in menus or dialog boxes for example, appear in the text like this: "We can command the robot to navigate to some position on the map using the 2D Nav Goal button."

Note

Warnings or important notes appear in a box like this.

Tip

Tips and tricks appear like this.
Reader feedback

Feedback from our readers is always welcome. Let us know what you think about this book—what you liked or may have disliked. Reader feedback is important for us to develop titles that you really get the most out of.

To send us general feedback, simply send an e-mail to <[email protected]>, and mention the book title via the subject of your message.

If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, see our author guide on www.packtpub.com/authors.
Customer support

Now that you are the proud owner of a Packt book, we have a number of things to help you get the most from your purchase.

Downloading the example code

You can download the example code files for all Packt books you have purchased from your account at http://www.packtpub.com. If you purchased this book elsewhere, you can visit http://www.packtpub.com/support and register to have the files e-mailed directly to you.

Downloading the color images of this book

We also provide you with a PDF file that has color images of the screenshots/diagrams used in this book. The color images will help you better understand the changes in the output. You can download this file from: https://www.packtpub.com/sites/default/files/downloads/7536OS_ImageBundle.pdf.

Errata

Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you find a mistake in one of our books—maybe a mistake in the text or the code—we would be grateful if you would report this to us. By doing so, you can save other readers from frustration and help us improve subsequent versions of this book. If you find any errata, please report them by visiting http://www.packtpub.com/submit-errata, selecting your book, clicking on the errata submission form link, and entering the details of your errata. Once your errata are verified, your submission will be accepted and the errata will be uploaded on our website, or added to any list of existing errata, under the Errata section of that title. Any existing errata can be viewed by selecting your title from http://www.packtpub.com/support.

Piracy

Piracy of copyright material on the Internet is an ongoing problem across all media. At Packt, we take the protection of our copyright and licenses very seriously. If you come across any illegal copies of our works, in any form, on the Internet, please provide us with the location address or website name immediately so that we can pursue a remedy.

Please contact us at <[email protected]> with a link to the suspected pirated material.

We appreciate your help in protecting our authors, and our ability to bring you valuable content.

Questions

You can contact us at <[email protected]> if you are having a problem with any aspect of the book, and we will do our best to address it.
Chapter 1. Introduction to Robotics

If you read an introductory chapter in any technical book, you may have noticed that it pretty much always follows the same structure. It begins by describing how awesome the topic is, what a good decision it is to start reading the book, and how you should keep on reading because there are many exciting things awaiting you in later chapters.

This chapter is no such chapter. It starts with the following quote:

Robotics is an art.

Although such a strong statement probably deserves some explanation, we believe that after you finish reading this book (and building your own robots!), no further explanation will be needed.

So if robotics is an art, how does one learn it? To put it differently, what are the differences between learning to play a musical instrument, learning to paint, learning to write, and learning robotics? We believe that there are not too many of them. Just as musicians need to play their instruments, painters need to produce paintings, and writers need to write their texts, roboticists (the term we use to describe people who build robots) need to build their robots. Just as musicians, painters, and writers need to learn the jargon used in their trades, roboticists need to familiarize themselves with a few basic terms that they might run into while reading tutorials, researching scientific literature, and talking to other robotics enthusiasts. Also, just as any artist needs to know at least a little bit about the history of their respective art, so does any good roboticist need to know a thing or two about the history of robotics. That's why, in this chapter, we will cover:

What is a robot?
Where do robots come from?
What can we find in a robot?
How do we build robots?

What is a robot?

Rather than defining what a robot is right away, let's pause for a moment and discuss whether we need to answer a question like this after all. Everybody knows that a robot is some sort of a machine that can move around and, depending on what movie you saw or which book you read, it can either help humans in their day-to-day life or mean the end of humanity.

It's clear that there is some controversy and there are lots of misunderstandings about robots and their role in the past, present, and future. In order to better understand the situation, let's first examine closely the term "robot" itself. Then, we will try to define it a bit more formally to prevent any misunderstanding or controversy.

History of the term robot

The term "robot" was used for the first time by Karel Čapek, a Czech writer, in his play Rossum's Universal Robots (R.U.R.), which he wrote in 1920, to denote an artificial human made out of synthetic
organic matter. These robots (roboti in Czech) were made in factories, and their purpose was to replace human workers. While they were very efficient and executed the orders they were given perfectly, they lacked any emotion. It seemed that humans would not need to work at all because robots seemed to be happy to work for them. This changed after a while, and a robot revolt resulted in the extinction of the human race.

R.U.R. is quite dark and disturbing, but it does not leave the future hopeless. It was considered quite a success back in the day, and we certainly do recommend that you read it. As its copyright had already expired in many countries at the time of writing this book, it should not be a problem to find a version online that is in the public domain.

"When he (Young Rossum) took a look at human anatomy he saw immediately that it was too complex and that a good engineer could simplify it. So he undertook to redesign anatomy, experimenting with what would lend itself to omission or simplification. Robots have a phenomenal memory. If you were to read them a twenty-volume encyclopedia they could repeat the contents in order, but they never think up anything original. They'd make fine university professors."

--Karel Čapek, R.U.R. (Rossum's Universal Robots), 1920

While many attribute the term robot to Karel Čapek, as he wrote the play in which it appeared for the first time, there are sources suggesting that it was actually Čapek's brother Josef who came up with the term (it seems that there was an article in a Czech daily written by Karel Čapek himself, in which he wanted to set the record straight by telling this story). Karel wanted to use the term laboři (from the Latin labor, work), but he did not like it. It seemed too artificial to him, so he asked his brother for advice. Josef suggested roboti, and that was what Karel used in the end.

Now that we know when the term robot was used for the first time and who actually created it, let's find out where it comes from. The explanation that many use is that it comes from the Czech
![Page 25: the-eye.eu · Table of Contents Learning Robotics Using Python Credits About the Author About the Reviewers Support files, eBooks, discount offers, and more Why subscribe? Free](https://reader036.vdocuments.mx/reader036/viewer/2022070110/6048e5c980e6911cf74d0afb/html5/thumbnails/25.jpg)
wordsrobotaandrobotník,whichliterallymeans"work"and"worker"respectively.However,thewordrobotaalsomeans"work"or"serflabor"inSlovak.Also,weshouldtakeintoaccountthatsomesourcessuggestthatbythetimeKarelwaswritingR.U.R,heandhisbrotheroftenvisitedhisfatherinasmallSlovakspatowncalledTrenčianskeTeplice.Therefore,itmightverywellbethatthetermrobotwasinspiredbytheusageoftheword"robota"inSlovaklanguage,whichiscoincidentally,thenativelanguageofoneoftheauthorsofthisbook.
Whether the term robot comes from Czech or Slovak, the word robota might be a matter of national pride, but it does not concern us too much. In both cases, the literal meaning is "work", "labor", or "hard work", and that was the purpose of Čapek's robots. However, robots have evolved dramatically over the past hundred years. To say that they are all about doing hard work would probably be an understatement.
So, let's try to define the notion of a robot as we perceive it today.
Modern definition of a robot

When we try to find a precise definition of some term, our first stop is usually some sort of encyclopedia or a dictionary. Let's try to do this for the term robot.
Our first stop will be Encyclopedia Britannica. Its definition of a robot is as follows:
"Anyautomaticallyoperatedmachinethatreplaceshumaneffort,thoughitmightnotresemblehumanbeingsinappearanceorpreformfunctionsinahumanlikemanner."
This is quite a nice definition, but there are quite a few problems with it.
First of all, it's a bit too broad. By this definition, a washing machine should also be considered a robot. It does operate automatically (well, most of them do), it does replace human effort (although not by doing the same tasks a human would do), and it certainly does not resemble a human.
Secondly, it's quite difficult to imagine what a robot actually is after reading this definition. With such a broad definition, there are way too many things that can be considered a robot, and this definition does not provide us with any specific features.
It turns out that while Encyclopedia Britannica's definition of a robot does not fit our needs well enough, it's actually one of the best ones that one can find. For example, The Free Dictionary defines a robot as "A mechanical device that sometimes resembles a human and is capable of performing a variety of often complex human tasks on command or by being programmed in advance." This is even worse than what we had, and it seems that a washing machine should still be considered a robot.
The inherent problem with these definitions is that they try to capture the vast amount of machines that we call robots these days. The result is that it's very difficult, if not impossible, to come up with a definition that will be comprehensive enough and not include a washing machine at the same time. Joseph Engelberger, founder of the world's first robotics company and of industrial robotics as we know it today, once famously said, "I can't define a robot, but I know one when I see one."
So, is it even possible to define a robot? Maybe not in general. However, if we limit ourselves just to the scope of this book, there may be a definition that will suit our needs well enough. In her very nice introductory book on the subject of robotics called The Robotics Primer (which we also highly recommend), Maja J. Matarić uses the following definition:
"Arobotisanautonomoussystemwhichexistsinthephysicalworld,cansenseitsenvironment,andcanactonittoachievesomegoals."
At first sight, it might not seem like a vast improvement over what we have so far, but let's dissect it part by part to see whether it meets our needs.
The first part says, "A robot is an autonomous system". By autonomous, we mean that a robot makes decisions on its own; it's not controlled by a human. This already seems to be an improvement, as it weeds out any machine that's controlled by someone (such as our famous washing machine). The robots that we will talk about throughout this book may sometimes have some sort of remote function, which allows a human to control them remotely, but this functionality is usually built in as a safety measure: if something goes wrong and the robot's autonomous systems fail to behave as we would expect them to, it's still possible to get the robot to safety and diagnose its problems afterwards. However, the main goal still stays the same, that is, to build robots that can take some direction from humans and are able to act and function on their own.
However, just being an autonomous system will certainly not be enough for a robot in this book. For instance, we can find many computer programs that we can call autonomous systems (they are not controlled by an individual and make decisions on their own), and yet we do not consider them to be robots.
To get around this obstacle, we need the other part of the sentence that says, "which exists in the physical world".
Given the recent advances in the fields of artificial intelligence and machine learning, there is no shortage of computer systems that act on their own and perform some work for us, which is what robots should be for. As a quite notorious example, let's consider spam filters. These are computer programs that read every e-mail that reaches your e-mail address and decide whether you may want to read it (and that the e-mail is indeed legitimate) or whether it's yet another example of an unwanted e-mail.
There is no doubt that such a system is helpful (if you disagree, try to read some of the e-mails in your Spam folder; I am pretty sure it will be a boring read). It's estimated that over 60 percent of all e-mail traffic in 2014 could be attributed to spam e-mails. Being able to automatically filter them can save us a lot of reading time. Also, as there is no human involved in the decision process (although we can help it by marking an e-mail as spam), we can call such a system autonomous. Still, we will not call it a true robot. Rather, we call these "software robots" or just "bots" (the fact that their name is shorter may come from the fact that they are short of the physical parts of true robots).
While software robots are definitely an interesting group on their own, it's the physical world in which robots operate that makes the process of creating them so exciting and difficult at the same time.
When creating a software robot, you can count on the fact that the environment it will run in (usually the operating system) will be quite stable (as in, not too many things may change unexpectedly). However, when you are creating a real robot, you can never be sure.
This is why a real robot needs to know what is happening in the environment in which it operates. Also, this is why the next part of the definition says, "can sense its environment".
Sensing what is happening around a real robot is arguably its most important feature. To sense their surrounding environments, robots usually have sensors. These are devices that measure physical characteristics of the environment and provide this information back to the robot so that it can, for instance, react to sudden changes of temperature, humidity, or pressure. This is quite a big difference from software robots. While they just get the information they need in order to operate somewhat magically, real robots need to have a subsystem or subsystems that take care of obtaining this information. If we look at the differences between robots and humans, we will not find many (in our very high-level view, of course). We can think of sensing subsystems as artificial replacements for human organs that provide this sort of information to the brain.
One important consequence of this definition is that anything that does not sense its environment cannot be called a robot. This includes any devices that just "drive blind" or move in a random fashion, because they do not have any information from the environment to base their behavior on.
Any roboticist will tell you that robots are very exciting machines. Many will also argue that what makes them so exciting is actually their ability to interact with the outside world (which is to move or otherwise change the environment they are in). Without this, they are just another static machine that might be useful, but rather unexciting.
Our definition of a robot reflects this in its last part when it says, "can act on it to achieve some goals".
Acting on the environment might sound like a very complex task for a robot, but in this case, it just means changing the world in some (even very slight) way. We call the parts of robots that perform this effectors. If we look at our robot versus human comparison, effectors are the artificial equivalents of hands, legs, and other body parts that allow it to move. Effectors make use of some lower-level systems, such as motors or muscles, that actually carry out the movement. We call these actuators. Although the artificial ones may seem to function similarly to the biological ones, a closer look will reveal that they are actually quite different.
You may have noticed that this part is not only about acting on the robot's environment, but also about achieving some goals. While many hobby roboticists build robots just for the fun of it, most robots are built in order to carry out (or, should we rather say, to help with) some tasks, such as moving heavy parts in a factory or locating victims in areas affected by natural disasters.
As we said before, a system or a machine that behaves randomly and does not use information from its environment cannot really be considered a robot. However, how can it use this information? The easiest thing to do is something useful, which we can rephrase as trying to reach some goal that we consider useful, which in turn brings us back to our definition. A goal of a robot does not necessarily need to be something as complex and ambitious as "hard labor for humans". It can easily be something simple, such as "do not bump into obstacles" or "turn the light switch on".
Now, as we have at least a slight idea of what a robot is, we can move on to briefly discuss where robots come from, in other words, the history of robotics.
Where do robots come from?

As the title suggests, this part of the chapter should be about the history of robots. We already know a few quite important facts, such as that the term robot was coined by the Czech author Karel Čapek in 1920. As it turns out, there are many more interesting events that happened over the years, other than this one. In order to keep things organized, let's start from the beginning.
It's quite difficult to pinpoint a precise date in history that we can mark as the date of birth of the first robot. For one, we have established quite a restrictive definition of a robot previously; thus, we will have to wait until the 20th century to actually see a robot in the proper sense of the word. Until then, let's at least discuss the honorable mentions.
The first one that comes close to a robot is a mechanical bird called "The Pigeon". It was devised by the Greek mathematician Archytas of Tarentum in the 4th century BC and was supposed to be propelled by steam. It cannot be considered a robot by our definition (not being able to sense its environment already disqualifies it), but it comes pretty close for its age. Over the following centuries, there were many attempts to create automatic machines, such as clocks measuring time using the flow of water, life-sized mechanical figures, or even the first programmable humanoid robot (it was actually a boat with four automatic musicians on it). The problem with all of these is that they are very disputable, as there is very little (or no) historically trustworthy information available about these machines.
It would have stayed like this for quite some time if it was not for Leonardo da Vinci's notebooks, which were rediscovered in the 1950s. They contain a complete drawing, from around 1495, of a humanoid (a fancy word for a mechanical device that resembles a human), which looks like an armored knight. It seems that it was designed so that it could sit up, wave its arms, move its head, and most importantly, amuse royalty. In the 18th century, following the amusement line, Jacques de Vaucanson created three automata: a flute player that could play twelve songs, a tambourine player, and the most famous one, "The Digesting Duck". This duck was capable of moving, quacking, flapping its wings, and even eating and digesting food (not in the way you would probably think; it just released matter stored in a hidden compartment). It was an example of "moving anatomy": modeling human or animal anatomy using mechanics.
Our list would not be complete if we omitted the robot-like devices that came about in the following century. Many of them were radio-controlled, such as Nikola Tesla's boat, which he showcased at Madison Square Garden in New York. You could command it to go forward, stop, turn left or right, turn its lights on or off, and even submerge. All of this did not seem too impressive at that time, because the press reports attributed it to "mind control".
At this point, we have once again reached the time when the term robot was used for the first time. As we said many times before, it was in 1920, when Karel Čapek used it in his play R.U.R. Two decades later, another very important term was coined: Isaac Asimov used the term robotics for the first time in his story "Runaround" in 1942. Asimov wrote many other stories about robots and is considered to be a prominent sci-fi author of his time.
However, in the world of robotics, he is known for his three laws of robotics:
First law: A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Second law: A robot must obey the orders given to it by human beings, except where such orders would conflict with the first law.
Third law: A robot must protect its own existence, as long as such protection does not conflict with the first or second law.
After a while, he added a zeroth law:
Zeroth law: A robot may not harm humanity or, by inaction, allow humanity to come to harm.
These laws somehow reflect the feelings people had about machines they called robots at that time. Seeing enslavement by some sort of intelligent machine as a real possibility, these laws were supposed to be guiding principles one should at least keep in mind, if not directly follow, when designing a new intelligent machine. Also, while many were afraid of the robot apocalypse, time has shown that it's still yet to come. In order for it to take place, machines will need to get some sort of intelligence, some ability to think and act based on their thoughts. Also, while we can see that over the course of history the mechanical side of robots went through some development, the intelligence simply was not there yet.
This was part of the reason why, in the summer of 1956, a group of very wise gentlemen (including Marvin Minsky, John McCarthy, Herbert Simon, and Allen Newell) got together to discuss creating intelligence in machines (hence the term artificial intelligence). They were later called the founding fathers of the newly founded field of artificial intelligence.
Although their goals were very ambitious (some sources even mention that their idea was to build this whole machine intelligence during that summer), it took quite a while until some interesting results could be presented.
One such example is Shakey, a robot built by the Stanford Research Institute (SRI) in 1966. It was the first robot (in our modern sense of the word) capable of reasoning about its own actions. The robots built before it usually had all the actions they could execute preprogrammed. Shakey, on the other hand, was able to analyze a more complex command and split it into smaller problems on its own. The following image of Shakey is taken from https://en.wikipedia.org/wiki/File:ShakeyLivesHere.jpg:
Shakey, resting in the Computer History Museum in Mountain View, California
His hardware was quite advanced too. He had collision detectors, sonar range finders, and a television camera. He operated in a small closed environment of rooms, which were usually filled with obstacles of many kinds. In order to navigate, it was necessary to find a way around these obstacles without bumping into anything. Shakey did it in a very straightforward way.
At first, he carefully planned his moves around these obstacles and slowly (the technology was not as advanced back then) tried to move around them. Of course, getting from a stable position into movement wouldn't be possible without some shaky moves. The problem was that Shakey's movements were mostly of this shaky nature, so he could not be called anything other than Shakey.
The lessons learned by the researchers who were trying to teach Shakey how to navigate in his environment turned out to be very important. It comes as no surprise that one of the results of the research on Shakey is the A* search algorithm (an algorithm that can very efficiently find the best path between a start point and a goal). This is considered to be one of the most fundamental building blocks not only in the field of robotics or artificial intelligence, but also in the field of computer science as a whole.
Our discussion on the history of robotics could go on for a very long time. Although one can definitely write a book on this topic (as it's a very interesting one), it's not this book; we shall get back to the question we tried to answer, which was: where do robots come from?
In a nutshell, robots evolved from very basic mechanical automation, through remotely controlled objects, to devices or systems that can act (or even adapt) on their own in order to achieve some goal.
If this sounds way too complicated, do not worry. The truth is that to build your own robot, you do not really need to deeply understand any of this. The vast majority of robots you will encounter are built from simple parts that are not difficult to understand when you see the big picture.
So, let's figure out how we will build our own robot. Let's find out what robots are made of.
What can we find in a robot?

In the very first part of this chapter, we tried to come up with a good (modern) definition of a robot. It turns out that the definition we came up with does not only describe a robot as we know it (or would like to know it), but also gives us some great pointers as to what parts we can most definitely find in (or on) a robot. Let's see our definition again:
"Arobotisanautonomoussystemwhichexistsinthephysicalworld,cansenseitsenvironment,andcanactonittoachievesomegoals."
So, what will these most important parts be? Here is what we think should be on the list.
The physical body

It will be hard for a robot to exist in the physical world without a physical body. While this obviously has its advantages (having a real-world robot you can play with is much more exciting than having a computer simulation), there is also a price to be paid. For instance, a physical robot can only be at one place at a time, cannot really change its shape, and its functions are quite limited by how its body looks. As its environment will be the physical world, it's safe to assume that the robot will not be the only object in it. This already poses some challenges, such as making sure that the robot won't run into a wall, an object, a human, or even another robot. Also, in order to do this, the robot needs to be able, as the definition says, to sense its environment.
Sensors

We already discussed in some depth how important a robot's sensors are, because without them it would be just lost. A good question to ask might be, "So, what does a robot actually sense?" As in many other places (in science and technology), it depends: on what the robot's purpose and goal in a given environment are, on the design of the robot, on the amount of power it consumes, and so on. A good robot designer and programmer tries to take all these dependencies into account so that, in the end, the final robot has the right amount of information about its environment to fulfill its purpose and reach its goals.
One important notion with regard to sensing is that of a state. A state of a robot basically means a description of all its parameters at any given time. For instance, if we consider a robot that has some sound sensors (thanks to which it can measure the noise level in its environment), but no way of figuring out how much battery power it has left, we can call its state partially observable. On the other hand, if it had a sensor for every output of the robot and every physical characteristic of the environment the robot resides in, we would call such a state fully observable.
Now that we know the state of the robot in the environment, our robot needs something that can be used to leave some effect on its environment: something like an effector.
Effectors

We already touched (albeit briefly) on the topic of effectors when we were trying to decipher parts of our definition of a robot, so we already know that effectors let the robot do physical things, and that their small subparts, actuators, are actually the ones that do the heavy lifting.
What we did not mention was that, historically, there are two main activities effectors can help with: locomotion and manipulation.
In general, locomotion means moving around: going from point A to point B. This is of great interest in a subfield of robotics called mobile robotics. This area of research is concerned with all sorts of robots that move in the air, underwater, or on the ground.
By manipulation, we mean the process of moving an object from one place to another. This process is of huge interest to manipulator robotics, which is concerned mostly with all sorts of robotic arms that, in the vast majority of cases, are used in industry.
Just for the sake of completeness, what are the different effectors our robots can make use of? Among the most basic ones will definitely be motors of all sorts, along with some wheels, which will allow the robot to move around.
Once we have data from the environment, we can also act on it. There is just one piece missing here: the link between them.
Controllers

With controllers, we finally come to the conclusion of this whole system. If it was not for controllers, a robot could never be fully autonomous. It is the controller's job to use data from sensors to decide what to do next and then execute some actions using effectors. This may look like a simple description, but in the end, it turns out that controllers are quite difficult to get right, especially when you are playing with them for the first time.
For most mobile robots and the vast majority of hobby robots, controllers are usually microprocessors that are programmed in some low-level programming language. It's also not uncommon for a robot to use multiple controllers. However, while it definitely helps to have a backup controller ready in case your main one breaks down, and it's great to have a modular system in which everything is its own module (and has its own controller), you do not get this for free. The price you have to pay is the communication between controllers, which requires a good deal of expertise.
Now that we have all the building blocks for a robot ready, we should at least briefly discuss the ways in which they can be organized. This might not seem important, but it turns out that having a good design up front can save us a lot of effort, energy, and resources. So, let's dive into how we can put a robot together architecturally.
How do we build a robot?

If we try to look at the parts of a robot from the previous part of this chapter in an abstract fashion, there are essentially three processes taking place: sensing (done by sensors), acting (done by effectors), and planning (if there is any, it's done by controllers). Depending on how we put these three processes together (as they are the building blocks, they are also called primitives), we can get different architectures with different properties. Let's at least say something about the three very basic architectures (also called paradigms).
Reactive control

Reactive control is probably the simplest architecture (or paradigm) one can put together with the primitives described previously. In this paradigm, as we can see in the following figure, there is no planning process involved. There is a direct connection between sensing and acting, which means that as soon as some sensory data comes in, the effectors act on the environment in some predefined way:
Just as the reflexes in your body do not send information about something happening all the way up to the brain (which would be quite slow), but rather just to the nearest part of the spinal cord so that the response can be fast, a reactively controlled robot will not perform any complex computation, but fast, precomputed actions that are stored somewhere.
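To make the idea concrete, here is a minimal sketch of a reactive controller in Python. The sensor reading, the threshold, and the action names are all hypothetical; the point is only that sensing maps directly to a predefined action, with no planning step in between:

```python
# A minimal sketch of reactive control: a sensor reading maps
# directly to a predefined action, with no planning in between.
# The threshold and action names are hypothetical.

def reactive_controller(distance_to_obstacle_cm: float) -> str:
    """Map a range-sensor reading straight to a motor command."""
    if distance_to_obstacle_cm < 10:
        return "turn_left"      # too close: swerve away
    return "go_forward"         # path is clear: keep moving

# The robot's main loop would just call this for every new reading:
for reading in [120.0, 35.0, 8.0]:
    print(reactive_controller(reading))
```

Because there is no deliberation, each response is as fast as a single function call, which is exactly the trade-off this paradigm makes.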
Hierarchical (deliberative) control

Suppose you were programming a chess-playing robot with the rules of ordinary chess: it would be your robot's turn, then your robot's opponent's, and so on. It's obvious that in a setting like this, your robot does not really need to be extremely fast. However, it would be great if it did some planning about the future, so that it could anticipate the opponent's future turns and adjust its strategy based on the opponent's current turn.
A setup like this is perfect for the hierarchical (or deliberative) control paradigm. As you can see in the following figure, the loop of planning, acting, and sensing is closed. Thus, the system can actively move towards its goal, whatever that might be:
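The plan-then-act loop of the hierarchical paradigm can also be sketched in a few lines. In this toy example (a robot moving along a one-dimensional line; all names are our own invention), the robot plans the full sequence of actions before executing any of them:

```python
# A toy sketch of hierarchical (deliberative) control: the robot
# senses the world, plans a whole sequence of actions towards its
# goal, and only then starts acting. All names are hypothetical.

def sense(position: int) -> int:
    """Pretend sensing: just report the current cell on a 1-D line."""
    return position

def plan(current: int, goal: int) -> list:
    """Plan every step from the current cell to the goal cell."""
    step = 1 if goal > current else -1
    return ["move_right" if step == 1 else "move_left"] * abs(goal - current)

def act(position: int, action: str) -> int:
    """Execute one planned action."""
    return position + (1 if action == "move_right" else -1)

position, goal = 2, 5
for action in plan(sense(position), goal):   # plan first, then act
    position = act(position, action)
print(position)   # the robot has deliberately reached its goal: 5
```

The price of this deliberation is latency: nothing moves until the whole plan exists, which is why the paradigm suits chess better than dodging obstacles.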
Hybrid control

So far, we have discussed control paradigms that were either fast but not very flexible, or smart but quite slow. What we really need in many cases is something in between, and this is precisely what the hybrid control paradigm tries to offer.
How can we use this in a real-life robot? Suppose we want to build a robotic waiter that would serve drinks in a coffee shop (coincidentally, that is what most of this book is about). Such a waiter would definitely need to have its own internal representation of the coffee shop (where the tables and chairs are located, and so on). Once it's given the task of delivering a cup of coffee to a given customer, it will have to plan its path and then move along that path. As we can expect this coffee shop to be quite a good one, there may be other guests inside too. We cannot let our robot bump into any chair or table, let alone collide with a customer, while it's trying to deliver coffee. For this, we need a well-tuned reactive controller.
The following figure shows the schematics of the hybrid control paradigm. We can see that the robot at first plans its task, but breaks it down into a series of actions that can be executed by the reactive paradigm. One interesting thing to note here is the fact that the sensory data is available both to the planning part of the system (as it needs to do some planning) and to the acting part (as it does the reactive control):
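A hybrid controller can be sketched as a deliberative planner whose individual steps are vetted by a reactive layer. In this toy example (all names hypothetical), the planner produces the whole path once, while the reactive layer consults the sensors at every step and can veto a move:

```python
# A toy sketch of hybrid control: a deliberative layer plans a path
# as a list of cells, while a reactive layer can override any step
# when the sensors report an obstacle. All names are hypothetical.

def plan_path(start: int, goal: int) -> list:
    """Deliberative layer: plan all the cells from start to goal."""
    return list(range(start + 1, goal + 1))

def reactive_step(next_cell: int, obstacle_cells: set) -> str:
    """Reactive layer: veto a planned step if it would hit an obstacle."""
    return "wait" if next_cell in obstacle_cells else "move"

obstacles = {3}   # pretend the sensors just spotted a guest at cell 3
log = []
for cell in plan_path(0, 5):                    # plan once (deliberative)
    log.append(reactive_step(cell, obstacles))  # check sensors each step
print(log)   # ['move', 'move', 'wait', 'move', 'move']
```

This mirrors the coffee-shop waiter: the path to the table is planned deliberatively, while collision avoidance stays reactive.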
That's about it! Now you know what a robot is, what makes it a robot, where it came from, the parts needed to create a robot, and how you can architecturally put it together. It's about time you built one yourself!
Summary

In this chapter, you learned what a robot actually is and where the term came from. We did our best to define a robot as an autonomous machine that exists in a physical world, can sense its environment, and can act on it to achieve some goals. We also went through a brief history of the field of robotics and discovered that many interesting machines were built prior to the era of real robots (by our definition). Later on, we discussed the basic building blocks of a robot, that is, effectors, sensors, and controllers, which can be combined in numerous ways. Finally, we dug a bit deeper into the architectures of control systems that are useful to keep in mind when designing a robot.
In the next chapter, we will finally see some real robots along with a real programming language.
Chapter 2. Mechanical Design of a Service Robot

The main purpose of this book is to learn robotics by designing and building robots and programming them using Python. To learn robotics, we will first look at how to mechanically design a robot from scratch. The robot that we are going to build will be used as a service robot in hotels and restaurants to serve food and drinks.
In this chapter, we will see the various mechanical components used in this robot and how to assemble them. We will design and assemble the parts using a CAD tool, and also build a 3D model of the robot for simulation.
The actual robot deployed in hotels may be big, but here we intend to build a miniature version of it, only for testing our technology. If you are interested in building a robot from scratch, this chapter is for you. If not, you can choose one of the robotic platforms already available on the market to work with this book.
To build the robot body, we first need to know the requirements of the robot; after getting the requirements, we can design it and draw the model in a 2D CAD tool to manufacture the robot parts. We will also discuss the 3D model used to simulate the robot in the next chapter.
The requirements of a service robot

Before designing any robotic system, the first procedure is to identify its requirements. The following is the set of hardware requirements to be met by this robot:
The robot should have a provision to carry food
The robot should be able to carry a maximum payload of 5 kg
The robot should travel at a speed between 0.25 m/s and 1 m/s
The ground clearance of the robot should be greater than 3 cm
The robot must be able to work continuously for 2 hours
The robot should be able to move and supply food to any table while avoiding obstacles
The robot's height should be between 40 cm and 1 meter
The robot should be low cost
Now, we can identify the mechanical design requirements, such as payload, moving speed, ground clearance, robot height, and the cost of the robot. We will design the body and select components accordingly. Let's discuss the robot drive mechanism we can use to match these requirements.
Robot drive mechanism

One of the most cost-effective solutions for mobile robot navigation is the differential drive system. It's one of the simplest drive mechanisms for a mobile robot and is mainly intended for indoor navigation. A differential drive robot consists of two wheels mounted on a common axis, each controlled by a separate motor. There are also two supporting wheels called caster wheels, which ensure the stability and weight distribution of the robot. The following diagram shows a typical differential drive system:
Differential drive system
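Although this chapter focuses on mechanics, the behavior of a differential drive is easy to capture in a few lines. The sketch below uses the standard textbook kinematic relations v = r(ωR + ωL)/2 and ω = r(ωR − ωL)/L (these relations are not from this book; variable names are our own):

```python
# A sketch of differential drive kinematics (standard textbook
# relations): from the two wheel angular speeds we get the robot's
# linear and angular velocity.

def diff_drive_velocity(omega_left, omega_right, wheel_radius, axle_length):
    """Return (linear m/s, angular rad/s) of a differential drive robot."""
    v = wheel_radius * (omega_right + omega_left) / 2.0          # forward speed
    w = wheel_radius * (omega_right - omega_left) / axle_length  # turn rate
    return v, w

# Both wheels at the same speed -> straight motion, no rotation:
v, w = diff_drive_velocity(10.0, 10.0, wheel_radius=0.045, axle_length=0.3)
print(round(v, 3), w)   # 0.45 0.0
```

Driving the wheels at different speeds makes the robot turn, which is exactly why two independent motors are enough for steering.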
The next step is to select the mechanical components of this robot drive system, mainly the motors, wheels, and robot chassis. Based on the requirements, we will first discuss how to select the motors.
Selection of motors and wheels

Motors are selected by looking at their specifications. Some of the important parameters for motor selection are torque and RPM. We can compute these values from the given requirements.
Calculation of RPM of motors
Assume the required robot speed is 0.35 m/s. We saw that the speed of the robot must be within 0.25 m/s to 1 m/s, as per the requirements. Take the diameter of the wheel as 9 cm because, according to the requirements, the ground clearance should be greater than 3 cm. Using the following equation, we can calculate the RPM of the motors:
RPM = (60 * Speed) / (3.14 * Diameter of Wheel)

RPM = (60 * 0.35) / (3.14 * 0.09) = 21 / 0.2826 = 74 RPM
Tip
You can also take a look at http://www.robotshop.com/blog/en/vehicle-speed-rpm-and-wheel-diameter-finder-9786 for this computation.
The RPM calculated with a 9 cm diameter wheel and a speed of 0.35 m/s is 74 RPM. We can take 80 RPM as the standard value.
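The RPM figure above is easy to re-check with a short Python snippet (a checking script of ours, not part of the robot's code):

```python
import math

# Re-check the motor RPM calculation from the requirements:
# RPM = (60 * speed) / (pi * wheel_diameter)

speed = 0.35            # required robot speed, m/s
wheel_diameter = 0.09   # wheel diameter, m (9 cm)

rpm = (60 * speed) / (math.pi * wheel_diameter)
print(round(rpm))   # -> 74, rounded up to a standard 80 RPM motor
```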
Calculation of motor torque
Let's calculate the torque required to move the robot:
1. Number of wheels = four, including two caster wheels.
2. Number of motors = two.
3. Let's assume the coefficient of friction is 0.6 and the radius of the wheel is 4.5 cm.
4. Take the total weight of the robot = weight of robot + payload = (W = mg) = (~100 N + ~50 N) = ~150 N, with a total mass of 15 kg.
5. The weight acting on the four wheels can be written as 2 * N1 + 2 * N2 = W, where N1 is the weight acting on each caster wheel and N2 is the weight acting on each motor wheel.
6. Assume that the robot is stationary. The maximum torque is required when the robot starts moving, as it must also overcome friction.
7. We can write the balance of the frictional force and the wheel torque as equal to zero until the robot moves. Solving for the robot torque in this condition gives us the maximum torque, as follows:
µ * N * r - T = 0, where µ is the coefficient of friction, N is the average weight acting on each wheel, r is the radius of the wheels, and T is the torque.
N = W/4 (assuming that the weight of the robot is equally distributed on all four wheels)

Therefore, we get:

0.6 * (150/4) * 0.045 - T = 0

Hence, T = 1.0125 N-m or 10.32 kg-cm
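The same torque estimate can be checked with a short Python sketch (the helper name and defaults are ours, chosen to match the assumptions in the steps above):

```python
def max_motor_torque(total_weight_n, wheel_radius_m, mu=0.6, n_wheels=4):
    """Maximum torque per motor, assuming the weight is split equally over all wheels."""
    normal_force = total_weight_n / n_wheels   # N = W / 4
    return mu * normal_force * wheel_radius_m  # T = mu * N * r, in N-m

t_nm = max_motor_torque(150, 0.045)
print(t_nm)                          # 1.0125 N-m
print(round(t_nm * 100 / 9.81, 2))   # ~10.32 kg-cm (divide by g, convert m to cm)
```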
The design summary

After the design, we calculated the following values:

Motor RPM = 80
Motor torque = 10.32 kg-cm
Wheel diameter = 9 cm
Robot chassis design

After computing the robot's motor and wheel parameters, we can design the robot chassis or robot body. As required, the robot chassis should have a provision to hold food, it should be able to withstand a payload of up to 5 kg, the ground clearance of the robot should be greater than 3 cm, and it should be low in cost. Apart from this, the robot should have a provision to place electronic components such as a Personal Computer (PC), sensors, and a battery.
One of the easiest designs that satisfies these requirements is a table-like design. The TurtleBot (http://www.turtlebot.com/) design is a kind of table-like design. It has three layers in the chassis. A robot platform called Roomba is the drive mechanism of this platform. The Roomba platform has motors and sensors built in, so we don't need to worry about designing the robot hardware. The following figure shows the TurtleBot robot chassis design:
TurtleBot robot
We will design a robot similar to TurtleBot, with our own moving platform and components. Our design will also have a three-layer architecture. Let's see which tools we need before we start designing.
Before we start designing the robot chassis, we need to know about Computer-aided design (CAD) tools. The popular tools available for CAD are:
SolidWorks (http://www.solidworks.com/default.htm)
AutoCAD (http://www.autodesk.com/products/autocad/overview)
Maya (http://www.autodesk.com/products/maya/overview)
Inventor (http://www.autodesk.com/products/inventor/overview)
Google SketchUp (http://www.sketchup.com/)
Blender (http://www.blender.org/download/)
LibreCAD (http://librecad.org/cms/home.html)
The chassis can be designed using any software you are comfortable with. Here, we will demonstrate the 2D model in LibreCAD and the 3D model in Blender. One of the highlights of these applications is that they are free and available for all OS platforms. We will use a 3D mesh viewing tool called MeshLab to view and check the 3D model design, and use Ubuntu as the main operating system. We will also see the installation procedures of these applications on Ubuntu 14.04.2 to start the designing process, and provide tutorial links to install the applications on other platforms too.
Installing LibreCAD, Blender, and MeshLab

LibreCAD is a free, open source 2D CAD application for Windows, OS X, and Linux. Blender is a free, open source 3D computer graphics package used to create 3D models, animations, and video games. It comes with a GPL license, as per which users can share, modify, and distribute the application. MeshLab is an open source, portable, and extensible system to process and edit unstructured 3D triangular meshes.
The following are the links to install LibreCAD on Windows, Linux, and OS X:
Visit http://librecad.org/cms/home.html to download LibreCAD
Visit http://librecad.org/cms/home/from-source/linux.html to build LibreCAD from source
Visit http://librecad.org/cms/home/installation/linux.html to install LibreCAD on Debian/Ubuntu
Visit http://librecad.org/cms/home/installation/rpm-packages.html to install LibreCAD on Fedora
Visit http://librecad.org/cms/home/installation/osx.html to install LibreCAD on OS X
Visit http://librecad.org/cms/home/installation/windows.html to install LibreCAD on Windows
Note
We can find the documentation on LibreCAD at the following link:
http://wiki.librecad.org/index.php/Main_Page.
Installing LibreCAD

The installation procedures for all operating systems are provided in the preceding links. If you are an Ubuntu user, you can also simply install it from the Ubuntu Software Centre.
Installing Blender

Visit the following download page to install Blender for your OS platform: http://www.blender.org/download/. You can find the latest version of Blender here. Also, you can find the latest documentation on Blender at http://wiki.blender.org/.
If you are using Ubuntu/Linux, you can simply install Blender via the Ubuntu Software Centre.
Installing MeshLab

MeshLab is available for all OS platforms. The following link will provide you the download links of prebuilt binaries and the source code of MeshLab:
http://meshlab.sourceforge.net/
If you are an Ubuntu user, you can install MeshLab from the apt package manager using the following command:
$ sudo apt-get install meshlab
Creating a 2D CAD drawing of the robot using LibreCAD

We can take a look at the basic interface of LibreCAD. The following screenshot shows the interface of LibreCAD:
The CAD toolbar has the necessary components to draw a model. The following screenshot shows the detailed overview of the CAD toolbar:
A detailed description of the LibreCAD tools is available at the following link:
http://wiki.librecad.org/index.php/LibreCAD_users_Manual.
Command Box: This is used to draw figures by only using commands. We can draw diagrams without touching any toolbar. A detailed explanation about the usage of the command box can be found at http://wiki.librecad.org/index.php/A_short_manual_for_use_from_the_command_line.
Layer List: This will have the layers used in the current drawing. A basic concept in computer-aided drafting is the use of layers to organize a drawing. A detailed explanation of layers can be found at http://wiki.librecad.org/index.php/Layers.
Block: This is a group of entities that can be inserted into the same drawing more than once, with different attributes at different locations, and with a different scale and rotation angle. A detailed explanation of blocks can be found at http://wiki.librecad.org/index.php/Blocks.
Absolute Zero: This is the origin of the drawing (0, 0).
Now, start sketching by setting the unit of drawing. Set the drawing unit to centimeters. Open LibreCAD and navigate to Edit | Application Preference. Set Unit as Centimeter, as shown in the following screenshot:
Let's start with the base plate design. The base plate has provisions to connect the motors and to place the battery and control board.
The base plate design

The following figure shows the robot's base plate. This plate provides provisions for the two motors of the differential drive and for a caster wheel each on the front and back of the base plate. Motors are mentioned as M1 and M2 in the diagram and the caster wheels are represented as C1 and C2. It also holds four poles to connect to the next plates. The poles are represented as P1-1, P1-2, P1-3, and P1-4. The screws are indicated as S and we will use the same screws here. There is a hole at the center to bring the wires from the motors to the top of the plate. The plate is cut on the left-hand side and the right-hand side so that the wheels can be attached to the motors. The distance from the center to the caster wheels is mentioned as 12.5 cm and the distance from the center to the motors is mentioned as 5.5 cm. The centers of the poles are at 9 cm in length and 9 cm in height from the center. The holes of all the plates follow the same dimensions:
The dimensions are not marked on the diagram; instead, they are mentioned in the following table:
Parts                           Dimension (cm) (Length x Height) (radius)
M1 and M2                       5 x 4
C1 and C2                       Radius = 1.5
S (Screw)                       0.15
P1-1, P1-2, P1-3, P1-4          Outer radius = 0.7, Height = 3.5
Left and Right Wheel Sections   2.5 x 10
Base plate                      Radius = 15
We can discuss more about the motor dimensions and clamp dimensions later.
The base plate pole design

The base plate has four poles to extend to the next layer. The poles are 3.5 cm in length with a radius of 0.7 cm. We can extend to the next plate by attaching hollow tubes to the poles. At the top of each hollow tube, we will insert a piece of hard plastic to make a screw hole. This hole will be useful to extend to the top layer. The base plate poles and the hollow tubes on each pole are shown in the following figure. Each hollow tube has a radius of 0.75 cm and a length of 15 cm:
Wheel, motor, and motor clamp design

We have to decide the diameter of the wheels and compute the motor requirements. Here, we show a typical motor and wheel that we can use if the design is successful:
The motor design can vary according to the motor selection; if necessary, this motor can be taken as per the design and can be changed after simulation. The X value in the motor diagram can vary according to the speed and torque of the motors. This is the gear assembly of the motor.
The following figure shows a typical wheel that we can use, with a diameter of 90 mm. A wheel with a diameter of 86.5 mm will become 90 mm after placing the grip.
The motor needs to be mounted on the base plate; to mount it, we need a clamp that can be screwed onto the plate, with the motor connected to the clamp. The following figure shows a typical clamp we can use for this purpose. It's an L-shaped clamp, with which we can mount the motor on one side and fit the other side to the plate:
Caster wheel design

Caster wheels need not have a special design; we can use any caster wheel that touches the ground in the same way as the main wheels. The following link has a collection of caster wheels that we can use for this design:

http://www.pololu.com/category/45/pololu-ball-casters
Middle plate design

The dimensions of this plate are the same as those of the base plate, and the screw size is also similar:
The middle plate can be held above the hollow tubes from the base plate. This arrangement is connected using another hollow tube that extends from the middle plate. The tube from the middle plate will have a screw at the bottom to fix the tube from the base plate to the middle plate, and a hollow end to connect the top plate. The top and side views of the tube extending from the middle plate are shown in the following figure:
This tube will connect the middle plate to the base plate and, at the same time, provide a provision to connect the top plate.
Top plate design

The top plate is similar to the other plates; it has four small poles of 3 cm, similar to the base plate. These poles can be placed on the hollow tubes from the middle plate. The four poles are connected to the plate, as shown here:
After the top plate design, the robot chassis design is almost finished; let's see how to build the 3D model of this robot using Blender. The 3D model is built for simulation purposes, whereas the 2D design we built is mainly for manufacturing purposes.
Working with a 3D model of the robot using Blender

In this section, we will design a 3D model of the robot. The 3D model is mainly used for simulation purposes. The modeling will be done using Blender. The Blender version must be greater than 2.6 because we have only tested the tutorials on these versions.
The following screenshot shows the Blender workspace and the tools that can be used to work with 3D models:
The main reason why we are using Blender here is that we can model the robot using Python scripts. Blender has an inbuilt Python interpreter and a Python script editor for coding purposes. We will not discuss the user interface of Blender here; a good tutorial is available on its website. Refer to the following link to learn about Blender's user interface:
http://www.blender.org/support/tutorials/
Let's start coding in Blender using Python.
Python scripting in Blender

Blender is mainly written in C, C++, and Python. Users can write their own Python scripts and access all the functionality of Blender. If you are an expert in the Blender Python APIs, you can model the entire robot using a Python script instead of manual modeling.
Blender uses Python 3.x. The Blender Python API is generally stable, but some areas are still being added and improved. Refer to http://www.blender.org/documentation/blender_python_api_2_69_7/ for the documentation on the Blender Python API.
Let's discuss the Blender Python APIs that we will use in our robot model script.
Introduction to Blender Python APIs

The Python APIs in Blender can perform most of the functionality of Blender. The main jobs that can be done by these APIs are as follows:

Edit any data inside Blender, such as scenes, meshes, particles, and so on
Modify user preferences, keymaps, and themes
Create new Blender tools
Draw the 3D view using OpenGL commands from Python
Blender provides the bpy module to the Python interpreter. This module can be imported in a script and gives access to Blender data, classes, and functions; scripts that deal with Blender data will need to import this module. The main Python modules we will use in bpy are:
Context access: This provides access to Blender user interface functions from the script (bpy.context).
Data access: This provides access to the Blender internal data (bpy.data).
Operators: This provides Python access to calling operators, which includes operators written in C, Python, or macros (bpy.ops).
To switch to scripting in Blender, we need to change the screen layout of Blender. The following screenshot shows the option that helps you switch to the Scripting layout:
After selecting the Scripting tab, we can see a text editor and a Python console window in Blender. In the text editor, we can code using the Blender APIs and also try Python commands via the Python console. Click on the New button to create a new Python script and name it robot.py. Now, we can design the 3D model of the robot using only Python scripts. The upcoming section has the complete script to design our robot model. We will discuss the code before running it. We hope you have read the Python APIs of Blender from their site. The code in the upcoming section is split into six Python functions to draw the three robot plates, draw the motors and wheels, draw the four support tubes, and export into the STereoLithography (STL) 3D file format for simulation.
Python script of the robot model

The following is the Python script of the robot model that we will design:
1. Before starting Python scripting in Blender, we must import the bpy module. The bpy module contains all the functionality of Blender and it can only be accessed from inside the Blender application:

import bpy
2. The following function will draw the base plate of the robot. It draws a cylinder with a radius of 15 cm (0.15 m in the code) and cuts a portion from the opposite sides so that the motors can be connected, using the Boolean modifier inside Blender:

# This function will draw the base plate
def Draw_Base_Plate():
3. The following two commands will create two cubes with a radius of 0.05 meters on either side of the base plate. The purpose of these cubes is to create a modifier that subtracts the cubes from the base plate. So, in effect, we will get a base plate with two cuts. After cutting the two sides, we will delete the cubes:

    bpy.ops.mesh.primitive_cube_add(radius=0.05, location=(0.175, 0, 0.09))
    bpy.ops.mesh.primitive_cube_add(radius=0.05, location=(-0.175, 0, 0.09))
    ####################################################
    # Adding base plate
    bpy.ops.mesh.primitive_cylinder_add(radius=0.15, depth=0.005, location=(0, 0, 0.09))

    # Adding boolean difference modifier from first cube
    bpy.ops.object.modifier_add(type='BOOLEAN')
    bpy.context.object.modifiers["Boolean"].operation = 'DIFFERENCE'
    bpy.context.object.modifiers["Boolean"].object = bpy.data.objects["Cube"]
    bpy.ops.object.modifier_apply(modifier="Boolean")

    ####################################################
    # Adding boolean difference modifier from second cube
    bpy.ops.object.modifier_add(type='BOOLEAN')
    bpy.context.object.modifiers["Boolean"].operation = 'DIFFERENCE'
    bpy.context.object.modifiers["Boolean"].object = bpy.data.objects["Cube.001"]
    bpy.ops.object.modifier_apply(modifier="Boolean")

    ####################################################
    # Deselect cylinder and delete cubes
    bpy.ops.object.select_pattern(pattern="Cube")
    bpy.ops.object.select_pattern(pattern="Cube.001")
    bpy.data.objects['Cylinder'].select = False
    bpy.ops.object.delete(use_global=False)
4. The following function will draw the motors and wheels attached to the base plate:

# This function will draw motors and wheels
def Draw_Motors_Wheels():
5. The following commands will draw a cylinder with a radius of 0.045 meters and a depth of 0.01 meters for each wheel. After creating a wheel, it will be rotated and translated into the cut portion of the base plate:

    # Create first wheel
    bpy.ops.mesh.primitive_cylinder_add(radius=0.045, depth=0.01, location=(0, 0, 0.07))
    # Rotate
    bpy.context.object.rotation_euler[1] = 1.5708
    # Translation
    bpy.context.object.location[0] = 0.135

    # Create second wheel
    bpy.ops.mesh.primitive_cylinder_add(radius=0.045, depth=0.01, location=(0, 0, 0.07))
    # Rotate
    bpy.context.object.rotation_euler[1] = 1.5708
    # Translation
    bpy.context.object.location[0] = -0.135
6. The following code will add two dummy motors to the base plate. The dimensions of the motors are mentioned in the 2D design. Each motor is basically a cylinder and it will be rotated and placed on the base plate:

    # Adding motors
    bpy.ops.mesh.primitive_cylinder_add(radius=0.018, depth=0.06, location=(0.075, 0, 0.075))
    bpy.context.object.rotation_euler[1] = 1.5708
    bpy.ops.mesh.primitive_cylinder_add(radius=0.018, depth=0.06, location=(-0.075, 0, 0.075))
    bpy.context.object.rotation_euler[1] = 1.5708
7. The following code will add a shaft to each motor. Similar to the motor model, the shaft is also a cylinder and it will be rotated and inserted into the motor model:

    # Adding motor shafts
    bpy.ops.mesh.primitive_cylinder_add(radius=0.006, depth=0.04, location=(0.12, 0, 0.075))
    bpy.context.object.rotation_euler[1] = 1.5708
    bpy.ops.mesh.primitive_cylinder_add(radius=0.006, depth=0.04, location=(-0.12, 0, 0.075))
    bpy.context.object.rotation_euler[1] = 1.5708
8. The following code will add the two caster wheels to the base plate. Currently, we are adding a cylinder as each wheel. In the simulation, we can assign it as a wheel:

    # Adding caster wheels
    bpy.ops.mesh.primitive_cylinder_add(radius=0.015, depth=0.05, location=(0, 0.125, 0.065))
    bpy.ops.mesh.primitive_cylinder_add(radius=0.015, depth=0.05, location=(0, -0.125, 0.065))
9. The following code will add a dummy Kinect sensor:

    # Adding Kinect
    bpy.ops.mesh.primitive_cube_add(radius=0.04, location=(0, 0, 0.26))
10. These functions will draw the middle plate and the top plate of the robot:

# Draw middle plate
def Draw_Middle_Plate():
    bpy.ops.mesh.primitive_cylinder_add(radius=0.15, depth=0.005, location=(0, 0, 0.22))

# Adding top plate
def Draw_Top_Plate():
    bpy.ops.mesh.primitive_cylinder_add(radius=0.15, depth=0.005, location=(0, 0, 0.37))
11. This function will draw all four supporting hollow tubes for the three plates:

# Adding support tubes
def Draw_Support_Tubes():
    # Cylinders
    bpy.ops.mesh.primitive_cylinder_add(radius=0.007, depth=0.30, location=(0.09, 0.09, 0.23))
    bpy.ops.mesh.primitive_cylinder_add(radius=0.007, depth=0.30, location=(-0.09, 0.09, 0.23))
    bpy.ops.mesh.primitive_cylinder_add(radius=0.007, depth=0.30, location=(-0.09, -0.09, 0.23))
    bpy.ops.mesh.primitive_cylinder_add(radius=0.007, depth=0.30, location=(0.09, -0.09, 0.23))
12. This function will export the designed robot to STL. We have to change the STL file path before executing the script:

# Exporting into STL
def Save_to_STL():
    bpy.ops.object.select_all(action='SELECT')
    # bpy.ops.mesh.select_all(action='TOGGLE')
    bpy.ops.export_mesh.stl(check_existing=True,
                            filepath="/home/lentin/Desktop/exported.stl",
                            filter_glob="*.stl", ascii=False,
                            use_mesh_modifiers=True, axis_forward='Y',
                            axis_up='Z', global_scale=1.0)
# Main code
if __name__ == "__main__":
    Draw_Base_Plate()
    Draw_Motors_Wheels()
    Draw_Middle_Plate()
    Draw_Top_Plate()
    Draw_Support_Tubes()
    Save_to_STL()
13. After entering the code in the text editor, execute the script by pressing the Run Script button, as shown in the following screenshot. The output 3D model will be shown in the 3D view of Blender. Also, if we check the desktop, we can see the exported.stl file for simulation purposes:
The exported.stl file can be opened with MeshLab; the following is a screenshot of MeshLab:
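Besides opening the file in MeshLab, we can run a quick sanity check on an exported binary STL from plain Python. The helper below is our own addition, not part of the book's workflow; it only works for binary STL files (the export above uses ascii=False), whose 84-byte header ends with a little-endian triangle count:

```python
import struct

def stl_triangle_count(path):
    """Return the triangle count stored in a binary STL file's header."""
    with open(path, "rb") as f:
        f.seek(80)                      # skip the 80-byte comment header
        return struct.unpack("<I", f.read(4))[0]

# Demo with a minimal dummy binary STL (80-byte header plus a count of 2);
# for the real check, pass the path to exported.stl instead.
with open("dummy.stl", "wb") as f:
    f.write(b"\0" * 80)
    f.write(struct.pack("<I", 2))

print(stl_triangle_count("dummy.stl"))  # -> 2
```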
Tip
Downloading the example code

You can download the example code files for all Packt books you have purchased from your account at http://www.packtpub.com. If you purchased this book elsewhere, you can visit http://www.packtpub.com/support and register to have the files e-mailed directly to you.
Questions

1. What is robot modeling and what are its uses?
2. What is the aim of the 2D robot model?
3. What is the aim of the 3D robot model?
4. What is the advantage of Python scripting over manual modeling?
Summary

This chapter was mainly aimed at robot mechanical design. It also included the robot parameter calculations and the robot chassis design. In robot design, we first need to have the prerequisites ready. Once they are ready, we can calculate the requirements of the components to be used in the robot. After the component requirements are met, we design the robot chassis according to the given requirements. The robot chassis design involves the 2D design of all the parts required to build the robot. After the 2D design, we saw how to build the 3D robot model using Blender and a Python script. The 3D model was built using the dimensions that we used in the 2D drawing. We also covered the Blender Python script to build the entire 3D model. In this chapter, we got the design of the robot that can be used to manufacture it, and also developed a 3D model for simulation. In the next chapter, we will discuss the simulation of this robot model and some popular simulation tools.
Chapter 3. Working with Robot Simulation Using ROS and Gazebo

In the last chapter, we looked at the mechanical design of our robot and designed its 2D and 3D models. In this chapter, we will simulate the robot that we designed. Before diving into simulation, we will look at the uses of robot simulation, its advantages and disadvantages, and various robotic software simulation tools.
We will also discuss the kinematics and dynamic parameters of the robot that will help you understand its functioning. After discussing these concepts, we will discuss the software platforms used for this simulation. We are planning to perform this simulation using Gazebo with the help of the Robot Operating System (ROS). After we discuss the basic concepts of ROS and Gazebo, we will implement the kinematic and dynamic model of the robot according to the Gazebo descriptions. Finally, we will simulate the robot in a typical hotel environment and test the autonomous navigation ability of the robot to serve food.
Understanding robotic simulation

In general, robotic simulation is the process of developing a virtual model capable of emulating a real-world process. Through simulation, we can create a virtual model of the robot and test its design and programming code.
One of the definitions of simulation, according to Systems Simulation: The Art and Science, Robert E. Shannon, Prentice Hall, is:
It is the process of designing a model of a real system and conducting experiments with this model for the purpose of understanding the behavior of the system and for evaluating various strategies for the operation of the system. Thus it is critical that the model be designed in such a way that the model behavior mimics the response behavior of the real system to events that take place over time.
The terms model and system are key components of our definition of simulation. By a model we mean a representation of a group of objects or ideas in some form other than that of the entity itself. By a system we mean a group or collection of interrelated elements that cooperate to accomplish some stated objective.
Robotic simulators are software applications that can model the robot and render the virtual environment that mimics the real environment of the robot. In our case, the environment is a typical hotel/restaurant with tables and chairs. We have to make the same arrangement in the simulator to test the robot's working.
The following figure shows a robot simulator called Gazebo. It also shows a robot called TurtleBot, along with some random objects. You will learn more about Gazebo and TurtleBot in the upcoming sections of this chapter.
Robot simulator Gazebo
We looked at the requirements to build the robot and its mechanical design. The next step is to simulate the design process. Simulation allows developers to test the programming code and validate the mechanical design of the robot according to the design proposal request. The virtual model of the robot can be modified without any additional costs.
One of the main advantages of performing the simulation process is that we can build a virtual prototype of a complex robot at a lower cost that behaves similarly to the actual design of the robot, and test the virtual robot until it meets the specifications. The disadvantage is that, using simulators, we cannot cover every scenario that may occur in the real world.
The advantages of simulation are:
Low cost to build a robot from scratch
The actual robot code can be tested with the simulated robot
The design of the robot can be modified without any cost
Any part of the robot can be tested
If it's a complex project, the robot can be simulated in stages
A complete simulation can determine whether the robot meets the specifications
Almost all simulation software is compatible with a wide range of programming languages
Some of the disadvantages are:
In the real world, there may be more parameters than in the virtual world; we can't model all these parameters in simulation
All simulation programs simulate only what they are programmed to simulate
Let's take a look at some of the latest robotic simulator applications:
Gazebo: This is a multi-robot simulator with support for many sensors. The software is compatible with ROS. It's a free and open source simulator used extensively for robotics research. The official website of Gazebo is www.gazebosim.org.
V-REP: This is one of the most advanced 3D simulators for industrial robots, designed by Coppelia Robotics. This tool offers support for a wide range of programming languages, including C/C++, Python, Java, Lua, Matlab, and Urbi. The simulator has built-in support for developing algorithms to simulate automation scenarios. The platform is used in education as well as by engineers. The official website of V-REP is http://www.coppeliarobotics.com/.
Webots: This is a 3D simulation platform developed by Cyberbotics, used in service and industrial robot simulations. The tool offers support for the Windows, Linux, and Apple platforms. It's one of the most common simulation packages used in education and for research purposes. Any robot can be modeled, programmed, and simulated in C, C++, Java, Python, Matlab, or URBI. The software is compatible with external libraries such as Open Source Computer Vision (OpenCV).
RoboLogix: This is a 3D industrial simulation package developed by Logic Design. The RoboLogix platform was designed to emulate real-world robotic applications with a five-axis industrial robot. The program installed on the robot can be developed and tested in a wide range of practical applications. The platform offers support for a range of industrial robots, including ABB, Fanuc, and Kawasaki.
Before performing the simulation, let's check how the robot works and the math behind it.
Mathematical modeling of the robot

The important part of a mobile robot is its steering system. It helps the robot navigate its environment. We will use the differential drive model to reduce the complexity, cost, and size of the robot. A differential-drive robot consists of two main wheels mounted on a common axis controlled by separate motors. A differential drive system/steering system is a nonholonomic system, which means it has constraints on its pose change. A car is an example of a nonholonomic system, as it cannot change its position without changing its pose. Let's look at how our robot works and how we model the robot in terms of its mathematics.
Introduction to the differential steering system and robot kinematics

Robot kinematics is the study of the mathematics of motion without considering the forces that affect the motion. It mainly deals with the geometric relationships that govern the system. Robot dynamics is the study of motion in which all the forces are modeled.
A mobile robot or vehicle has six degrees of freedom (DOF) expressed by the pose (x, y, z, roll, pitch, and yaw). It consists of the position (x, y, z) and the attitude (roll, pitch, and yaw). Roll refers to sidewise rotation, pitch refers to forward and backward rotation, and yaw (called the heading or orientation) refers to the direction in which the robot moves in the x-y plane. A differential-drive robot moves in the x-y plane, so its 2D pose consists mainly of x, y, and θ, where θ is the heading of the robot, pointing in the robot's forward direction. This much information is sufficient to describe a differential-drive robot's pose.
The pose of the robot in x, y, and θ in the global coordinate system

In a differential-drive robot, the motion can be controlled by adjusting the velocities of two independently controlled motors on the left-hand side and the right-hand side, that is, V-left and V-right respectively. The following figure shows a couple of popular differential drive robots available on the market:
iRobot Roomba and Pioneer 3DX
The forward kinematics equations for a robot with a differential-drive system are used to solve the following problem:

If the robot is standing in a pose (x, y, θ) at time t, determine the pose (x', y', θ') at t + δt, given the control parameters V-left and V-right.

This technique can be used in the robot to follow a particular trajectory.
Explaining the forward kinematics equation
We can start by formulating a solution for forward kinematics. The following figure is an illustration of one of the wheels of the robot:
A single wheel of the robot rotating along the local y axis
The motion around the y axis is known as roll; everything else can be considered as slip. Let's assume that no slip occurs in this case. When the wheel completes one full rotation, it covers a distance of 2πr, where r is the radius of the wheel. We will assume that the movement is two-dimensional, which means that the surface is flat and even.
When the robot is about to perform a rolling motion, the robot must rotate around a point that lies along its common left and right wheel axis. The point that the robot rotates around is known as the Instantaneous Center of Curvature (ICC). The following diagram shows the wheel configuration of a differential drive with the ICC:
Wheel configuration for a robot with differential drive
The central concept for the derivation of the kinematic equations is the angular velocity ω of the robot. Each wheel on the robot moves around the ICC along a circle with a radius of r (here r denotes the radius of the wheel's circular path around the ICC).

The speed of the wheel is v = 2πr/T, where T is the time taken to complete one full turn around the ICC. The angular velocity ω is defined as 2π/T and typically has the unit radians (or degrees) per second. Combining the equations for v and ω yields v = ωr.
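As a quick numerical check of these relations (the radius and rotation period below are made-up values, not from the text):

```python
import math

# Assumed values: the wheel follows a circle of radius 0.5 m around the
# ICC and takes T = 2.0 s for one full turn around it.
r = 0.5
T = 2.0

v = 2 * math.pi * r / T   # linear speed of the wheel, v = 2*pi*r / T
w = 2 * math.pi / T       # angular velocity, omega = 2*pi / T

# The two relations combine to v = omega * r:
assert abs(v - w * r) < 1e-12
```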
v = ωr
A detailed model of the differential-drive system is shown in the following figure:

If we apply the previous equation to both wheels, the result will be the same ω:
Vr = ω (R + l/2) ... (3)
Vl = ω (R − l/2) ... (4)
where R is the distance between the ICC and the midpoint of the wheel axis, and l is the length of the wheel axis. After solving for ω and R, we get the following result:
R = (l/2) (Vl + Vr) / (Vr − Vl)
ω = (Vr − Vl) / l ... (5)
The previous equations are useful for solving the forward kinematics problem. Suppose the robot moves with an angular velocity of ω for δt seconds; this changes the robot's orientation, or where it is heading, to:
θ' = θ + ωδt ... (6)
where the center of rotation, the ICC, is given by basic trigonometry as:
ICC = [ICCx, ICCy] = [x − R sinθ, y + R cosθ] ... (7)
Rotating the robot by ωδt around the ICC
Given a starting position (x, y), the new position (x', y') can be computed using the 2D rotation matrix. The rotation around the ICC with angular velocity ω for δt seconds yields the following position at time t + δt:
x' = cos(ωδt) (x − ICCx) − sin(ωδt) (y − ICCy) + ICCx
y' = sin(ωδt) (x − ICCx) + cos(ωδt) (y − ICCy) + ICCy ... (8)
The new pose (x', y', θ') can be computed from equations (6) and (8), given ω, δt, and R.

ω can be computed from equation (5); however, Vr and Vl are often difficult to measure accurately. Instead of measuring the velocities, the rotation of each wheel can be measured using sensors called wheel encoders. The data from the wheel encoders gives the robot's odometry values. These sensors are mounted on the wheel axes and deliver binary signals for each step the wheel rotates (each step may be in the order of 0.1 mm). These signals are fed to a counter, such that v δt is the distance travelled from time t to t + δt. We can write:
n · step = v δt

From this, we can compute v:
v = n · step / δt ... (9)
If we insert equation (9) into equations (3) and (4), we get the following result:
R = (l/2) (nl + nr) / (nr − nl) ... (10)
ωδt = (nr − nl) · step / l ... (11)
Here, nl and nr are the encoder counts of the left and right wheels, and Vl and Vr are the speeds of the left and right wheels respectively. Thus, if the robot stands in pose (x, y, θ) and moves nl and nr counts during a time step δt, the new pose (x', y', θ') is given by:
x' = cos(ωδt) (x − ICCx) − sin(ωδt) (y − ICCy) + ICCx ... (12)
y' = sin(ωδt) (x − ICCx) + cos(ωδt) (y − ICCy) + ICCy ... (13)
θ' = θ + ωδt ... (14)
where,

R = (l/2) (nl + nr) / (nr − nl), ωδt = (nr − nl) · step / l
ICCx = x − R sinθ, ICCy = y + R cosθ ... (15)
The derived kinematic equations depend mainly on the design and geometry of the robot. Different designs can lead to different equations.
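The pose update described by equations (12) to (15) can be collected into a short Python sketch. This is a minimal illustration rather than code from the book; the function name, the variable names, and the guard for straight-line motion (where R would be infinite) are my own choices:

```python
import math

def update_pose(x, y, theta, n_l, n_r, step, l):
    """Differential-drive odometry: return the new pose (x', y', theta')
    after the left/right wheel encoders count n_l and n_r ticks.

    step is the distance traveled per encoder tick and l is the length
    of the wheel axis (the distance between the two wheels).
    """
    d_l = n_l * step              # distance traveled by the left wheel
    d_r = n_r * step              # distance traveled by the right wheel
    omega_dt = (d_r - d_l) / l    # heading change, omega * delta-t

    if abs(omega_dt) < 1e-9:
        # Straight-line motion: R is infinite, so just advance along
        # the current heading without rotating.
        d = (d_l + d_r) / 2.0
        return x + d * math.cos(theta), y + d * math.sin(theta), theta

    # Distance from the ICC to the midpoint of the wheel axis.
    R = (l / 2.0) * (d_l + d_r) / (d_r - d_l)
    icc_x = x - R * math.sin(theta)
    icc_y = y + R * math.cos(theta)

    # Rotate the robot by omega_dt around the ICC.
    c, s = math.cos(omega_dt), math.sin(omega_dt)
    x_new = c * (x - icc_x) - s * (y - icc_y) + icc_x
    y_new = s * (x - icc_x) + c * (y - icc_y) + icc_y
    return x_new, y_new, theta + omega_dt
```

Equal encoder counts drive the pose straight ahead, and opposite counts turn the robot in place, matching the special cases of differential-drive control.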
Inverse kinematics

The forward kinematics equation provides an updated pose for given wheel speeds. We can now consider the inverse problem:

Stand in pose (x, y, θ) at time t and determine the control parameters V-left and V-right such that the pose at time t + δt is (x', y', θ').
In a differential-drive system, this problem may not always have a solution, because this kind of robot can't be moved to an arbitrary pose simply by setting the wheel velocities; robots subject to such constraints are called nonholonomic robots. A holonomic robot, by contrast, can move to any pose directly.

For nonholonomic robots, there are ways to work around the constrained mobility if we allow a sequence of different (V-left, V-right) movements. If we insert the values from equations (12) to (15), we can identify some special cases of control:
If V-right = V-left => nr = nl => R = ∞, ωδt = 0 => x' = x + v δt cosθ, y' = y + v δt sinθ, θ' = θ: this means the robot moves in a straight line and θ remains the same.

If V-right = −V-left => nr = −nl => R = 0, ωδt = 2 nl · step / l, and x' = x, y' = y, θ' = θ + ωδt: this means the robot rotates in place around the ICC, that is, any θ is reachable while (x, y) remains unchanged.
Combining these operations, the following algorithm can be used to reach any target pose from the starting pose:

1. Rotate until the robot's orientation coincides with the line from the starting position to the target position: V-right = −V-left = V-rot.
2. Drive straight until the robot's position coincides with the target position: V-right = V-left = V-ahead.
3. Rotate until the robot's orientation coincides with the target orientation: V-right = −V-left = V-rot.

where V-rot and V-ahead can be chosen arbitrarily.
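The three-step rotate-drive-rotate plan above can be sketched in a few lines of Python. This is an illustrative sketch only; the function name and the angle-wrapping helper are my own, and it returns the two rotation angles and the straight-line distance rather than wheel velocities:

```python
import math

def rotate_drive_rotate(start, target):
    """Decompose a move between two poses (x, y, theta) into three
    steps: rotate toward the target point, drive straight to it,
    then rotate to the target heading."""
    x, y, theta = start
    xt, yt, theta_t = target

    def wrap(a):
        # Normalize an angle to the range (-pi, pi].
        return math.atan2(math.sin(a), math.cos(a))

    heading = math.atan2(yt - y, xt - x)   # direction of the straight leg
    rot1 = wrap(heading - theta)           # step 1: face the target position
    dist = math.hypot(xt - x, yt - y)      # step 2: drive straight
    rot2 = wrap(theta_t - heading)         # step 3: match the target heading
    return rot1, dist, rot2
```

Each rotation can then be executed with V-right = −V-left = V-rot, and the straight leg with V-right = V-left = V-ahead.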
Note
Refer to http://www8.cs.umu.se/~thomash/reports/KinematicsEquationsForDifferentialDriveAndArticulatedSteeringUMINF-11.19.pdf for more information on kinematics equations.
We can now switch to the details of the tools we are using to simulate this robot. Understanding the kinematics of the robot will help you to build its simulation; it also helps you to write the software for the robot. The tools we will use for the simulation are:

Robot Operating System (ROS)
Gazebo

These are some of the most popular tools available for robotics programming and simulation. Let's look at the features and a short introduction of ROS and Gazebo. Later, we will discuss how to perform a simulation using these tools.
Introduction to ROS and Gazebo

ROS is a software framework for writing robot software. The main aim of ROS is to reuse robotic software across the globe. ROS consists of a collection of tools, libraries, and conventions that aim to simplify the task of creating complex and robust robot behavior across a wide variety of robotic platforms.
The official definition of ROS is:

ROS is an open-source, meta-operating system for your robot. It provides the services you would expect from an operating system, including hardware abstraction, low-level device control, implementation of commonly-used functionality, message-passing between processes, and package management. It also provides tools and libraries for obtaining, building, writing, and running code across multiple computers. ROS is similar in some respects to 'robot frameworks', such as Player, YARP, Orocos, CARMEN, Orca, MOOS, and Microsoft Robotics Studio.
Note
Refer to http://wiki.ros.org/ROS/Introduction for more information on ROS.
Some of the main features of ROS are:

Distributed framework: ROS is a distributed framework that can run on multiple machines, so the computation of the robot can be divided over different machines. This can reduce the onboard processing of the robot.
Code reuse: The main goal of ROS is to reuse code. Code reuse enables the growth of a good research and development community around the world. ROS executables are called nodes. These executables can be grouped into a single entity called a ROS package. A group of packages is called a stack, and these stacks can be shared and distributed.
Language independence: The ROS framework can be programmed using popular languages (such as Python, C++, and Lisp). Nodes can be written in any language and can communicate through ROS without any issues.
Easy testing: ROS has a built-in unit/integration test framework called rostest to test ROS packages.
Scaling: ROS is appropriate for large runtime systems and large development processes.
Free and open source: The source code of ROS is open and absolutely free to use. The core part of ROS is licensed under the BSD license, and it can be reused in commercial and closed source products.
Some of the main concepts of ROS are discussed in the upcoming sections.
ROS Concepts
There are mainly three levels of ROS:

The ROS filesystem
The ROS Computation Graph
The ROS community
The ROS filesystem

The ROS filesystem mainly covers how ROS files are organized on disk. The main terms we have to understand are:

Packages: ROS packages are the main units of the ROS software framework. A ROS package may contain executables, ROS-dependent libraries, configuration files, and so on. ROS packages can be reused and shared.
Package manifests: The manifest (package.xml) file contains all the details of a package, including its name, description, license, and dependencies.
Message (msg) types: Message descriptions are stored in the msg folder of a package. ROS messages are data structures for sending data through ROS. Message definitions are stored in files with the .msg extension.
Service (srv) types: Service descriptions are stored in the srv folder with the .srv extension. The srv files define the request and response data structures for services in ROS.
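For illustration, a .msg file is just a list of typed fields, one per line. A hypothetical msg/WheelSpeeds.msg for the differential-drive robot discussed earlier could look like the following (the file name and fields are made up for this example and are not part of any standard ROS package):

```
# WheelSpeeds.msg - hypothetical message carrying the left and right
# wheel speeds of a differential-drive robot, in meters per second
float32 left
float32 right
```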
The ROS Computation Graph
The ROS Computation Graph is the peer-to-peer network of ROS processes that process data together. The basic concepts of the ROS Computation Graph are nodes, the ROS Master, the parameter server, messages, and services.
Nodes: These are processes that perform computation. For example, one node of a robot publishes the odometry data of the robot, another node publishes the laser scanner data, and so on. A ROS node is written with the help of a ROS client library (such as roscpp or rospy). We will look at this library during the sample node creation.
ROS Master: This provides name registration and lookup for the rest of the Computation Graph. Without starting the Master, nodes will not find each other or exchange messages.
Parameter server: This allows data to be stored in a central location.
Messages: Nodes communicate with each other by passing messages. A message is simply a data structure comprising typed fields. It supports data types such as integer, floating point, Boolean, and so on.
Topics: Nodes exchange data in the form of messages via the ROS transport system under a specific name called a topic. The topic is the name used to identify the content of the message. A node interested in a certain kind of data will subscribe to the appropriate topic. In general, publishers and subscribers are not aware of each other's existence. The idea is to decouple the production of information from its consumption. Logically, one can think of a topic as a strongly typed message bus. Each bus has a name, and anyone can connect to the bus to send or receive messages as long as they are of the right type.
Services: The publish/subscribe model is a very flexible communication paradigm, but its many-to-many, one-way transport is not appropriate for request/reply interactions, which are often required in a distributed system. Request/reply is done via services, which are defined by a pair of message structures: one for the request and one for the reply. A providing node offers a service under a name, and a client uses the service by sending the request message and awaiting the reply. ROS client libraries generally present this interaction to the programmer as if it were a remote procedure call.
Bags: These are formats to save and play back ROS message data. Bags are an important mechanism to store data (such as sensor data) that can be difficult to collect but is necessary to develop and test algorithms.
The ROS Master acts as a name service in the ROS Computation Graph. It stores topic and service registration information for ROS nodes. Nodes communicate with the Master to report their registration information. As these nodes communicate with the Master, they can receive information about other registered nodes and make connections as appropriate. The Master will also make callbacks to these nodes when the registration information changes, which allows nodes to dynamically create connections as new nodes are run.

Nodes connect to other nodes directly; the Master only provides the lookup information, much like a DNS server. Nodes that subscribe to a topic will request connections from nodes that publish that topic and will establish those connections over an agreed-upon connection protocol. The most common protocol used in ROS is called TCPROS, which uses standard TCP/IP sockets.

The following figure shows how topics and services work between nodes and the Master:
The ROS community level

The ROS community level concepts are ROS resources that enable separate communities to exchange software and knowledge. These resources include:

Distributions: ROS distributions are collections of versioned stacks that you can install. Distributions play a role similar to Linux distributions: they make it easier to install software and maintain consistent versions of it.
Repositories: ROS relies on a federated network of code repositories, where different institutions can develop and release their own robot software components.
The ROS Wiki: This is the main forum for documenting information about ROS. Anyone can sign up for an account and contribute their own documentation, provide corrections or updates, write tutorials, and so on.
Mailing lists: The ros-users mailing list is the primary communication channel about new updates to ROS. It is also a forum to ask questions about the ROS software.
There are many more concepts to be discussed about ROS; you can refer to the official ROS website at www.ros.org for more information. Now, we will look at the installation procedure of ROS.
Installing ROS Indigo on Ubuntu 14.04.2

As per our previous discussion, we know that ROS is a meta operating system to be installed on a host system. ROS is completely supported on Ubuntu/Linux and is in the experimental stages on Windows and OS X. Some of the latest ROS distributions are:
Distribution           Released date
ROS Indigo Igloo       July 22, 2014
ROS Hydro Medusa       September 4, 2013
ROS Groovy Galapagos   December 31, 2012
We will discuss the installation procedure of the latest ROS distribution, called Indigo Igloo, on Ubuntu 14.04.2 LTS. ROS Indigo Igloo primarily targets Ubuntu 14.04 LTS. If you are a Windows or OS X user, you can install Ubuntu in a VirtualBox application and install ROS on it. The link to download VirtualBox is https://www.virtualbox.org/wiki/Downloads.
The installation instructions are as follows:

1. Configure your Ubuntu repositories to allow restricted, universe, and multiverse downloads. We can configure this using Ubuntu's Software & Updates tool. Open this tool by simply searching for it in the Ubuntu Unity search menu and tick the following options, as shown in the following screenshot:
Ubuntu's Software & Updates tool
2. Set up your system to accept ROS packages from packages.ros.org. ROS Indigo is supported only on Ubuntu 13.10 and Ubuntu 14.04. The following command will add packages.ros.org to Ubuntu's apt repository list:

$ sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu trusty main" > /etc/apt/sources.list.d/ros-latest.list'
3. Next, we have to add apt-keys. apt-key is used to manage the list of keys used by apt to authenticate packages. Packages that have been authenticated using these keys will be considered trusted. The following command will add the apt-key for ROS packages:

$ wget https://raw.githubusercontent.com/ros/rosdistro/master/ros.key -O - | sudo apt-key add -
4. After adding the apt-keys, we have to update the Ubuntu package index. The following command will add and update the ROS packages along with the Ubuntu packages:

$ sudo apt-get update
5. After updating the ROS package list, we can install the packages. The following command will install the necessary tools and libraries of ROS:

$ sudo apt-get install ros-indigo-desktop-full
6. We may need to install additional packages even after the desktop-full installation; each additional installation will be mentioned in the appropriate section. The desktop-full install will take some time. After the installation of ROS, you are almost done. The next step is to initialize rosdep, which enables you to easily install system dependencies for the ROS source packages you want to compile and is required to run some core components in ROS:

$ sudo rosdep init
$ rosdep update
7. To access the ROS tools and commands in the current bash shell, we can add the ROS environment variables to the .bashrc file. This will be executed at the beginning of each bash session. The following command adds the ROS variables to .bashrc:

echo "source /opt/ros/indigo/setup.bash" >> ~/.bashrc

The following command will execute the .bashrc script in the current shell to apply the changes:

source ~/.bashrc
8. One of the useful tools to install is rosinstall. This tool has to be installed separately. It enables you to easily download many source trees for ROS packages with one command:

$ sudo apt-get install python-rosinstall
After the installation of ROS, we will discuss how to create a sample package in ROS. Before creating a package, we have to build a ROS workspace; packages are created inside the ROS workspace. We will use the catkin build system, a set of tools to build packages in ROS. The catkin build system generates executables or shared libraries from source code. ROS Indigo uses the catkin build system to build packages. Let's see what catkin is.
Introducing catkin
catkin is the official build system of ROS. Before catkin, ROS used the rosbuild system to build packages; catkin replaced it in the latest ROS versions. catkin combines CMake macros and Python scripts to provide the normal CMake workflow. catkin provides better distribution of packages, better cross-compilation, and better portability than the rosbuild system. For more information, refer to wiki.ros.org/catkin.
A catkin workspace is a folder where you can modify, build, and install catkin packages.

Let's check how to create a ROS catkin workspace.
The following command will create a parent directory called catkin_ws and a subfolder called src:

$ mkdir -p ~/catkin_ws/src

Switch to the src folder using the following command. We will create our packages in the src folder:

$ cd ~/catkin_ws/src

Initialize the catkin workspace using the following command:

$ catkin_init_workspace
After you initialize the catkin workspace, you can simply build the packages (even if there are no source files) using the following commands:

$ cd ~/catkin_ws/
$ catkin_make

The catkin_make command is used to build packages inside the src directory. After building the packages, we will see build and devel folders inside catkin_ws. The executables are stored in the build folder, and the devel folder contains shell script files that add the workspace to the ROS environment.
Creating an ROS package
In this section, we will see how to create a sample package that contains two Python nodes. One of the nodes is used to publish a Hello World message on a topic, and the other node subscribes to this topic.

A catkin ROS package can be created using the catkin_create_pkg command in ROS.

The package is created inside the src folder that we created while setting up the workspace. Before creating a package, switch to the src folder using the following command:

$ cd ~/catkin_ws/src

The following command will create a hello_world package with std_msgs and rospy as dependencies; std_msgs contains standard message definitions, and rospy is the Python client library for ROS:

$ catkin_create_pkg hello_world std_msgs rospy
This is the message we get after the successful creation:

Created file hello_world/package.xml
Created file hello_world/CMakeLists.txt
Created folder hello_world/src
Successfully created files in /home/lentin/catkin_ws/src/hello_world. Please adjust the values in package.xml.
After the successful creation of the hello_world package, we need to add two Python nodes or scripts to demonstrate the subscribing and publishing of topics.

First, create a folder named scripts in the hello_world package using the following command:

$ mkdir scripts

Switch to the scripts folder and create a script named hello_world_publisher.py and another script called hello_world_subscriber.py to publish and subscribe to the hello world message. The following sections cover the code and explanation of these scripts or nodes:
Hello_world_publisher.py
The hello_world_publisher.py node basically publishes a greeting message called hello world to the topic /hello_pub. The greeting message is published to the topic at a rate of 10 Hz.
The step-by-step explanation of this code is as follows:

1. We need to import rospy if we are writing a ROS Python node. It contains Python APIs to interact with ROS topics, services, and so on.
2. To send the hello world message, we have to import the String data type from the std_msgs package, which has message definitions for standard data types. We can import it using the following commands:

#!/usr/bin/env python
import rospy
from std_msgs.msg import String
3. The following line of code creates a publisher object for a topic called hello_pub. The data type is String and queue_size is 10. If the subscriber is not fast enough to receive the data, we can use the queue_size option to adjust the buffering:

def talker():
    pub = rospy.Publisher('hello_pub', String, queue_size=10)
4. The following line of code is mandatory for all ROS Python nodes. It initializes the node and assigns a name to it. The node cannot be launched until it gets a name, and it communicates with other nodes using its name. If two nodes are running with the same node name, one will shut down. If we want to run both nodes, use the anonymous=True flag, as shown here:

    rospy.init_node('hello_world_publisher', anonymous=True)
5. The following line creates a rate object called r. Using the sleep() method of the Rate object, we can update the loop at a desired rate. Here, we give a rate of 10:

    r = rospy.Rate(10) # 10hz
6. The following loop checks the rospy.is_shutdown() flag and then executes the loop body. If we press Ctrl + C, the loop will exit.

Inside the loop, a hello world message is printed on the terminal and published on the hello_pub topic at a rate of 10 Hz:

    while not rospy.is_shutdown():
        str = "hello world %s" % rospy.get_time()
        rospy.loginfo(str)
        pub.publish(str)
        r.sleep()
7. In addition to the standard Python __main__ check, the following code catches a rospy.ROSInterruptException exception, which can be thrown by the rospy.sleep() and rospy.Rate.sleep() methods when Ctrl + C is pressed or your node is otherwise shut down. The reason this exception is raised is so that you don't accidentally continue executing code after the sleep() method:

if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass
After publishing to the topic, we will see how to subscribe to it. The following section covers the code to subscribe to the hello_pub topic.
Hello_world_subscriber.py
The subscriber code is as follows:

#!/usr/bin/env python
import rospy
from std_msgs.msg import String
The following code is a callback function that is executed when a message arrives on the hello_pub topic. The data variable contains the message from the topic, and it is printed using rospy.loginfo():

def callback(data):
    rospy.loginfo(rospy.get_caller_id() + " I heard %s", data.data)
The following section starts the node with the name hello_world_subscriber and subscribes to the /hello_pub topic.

1. The data type of the message is String, and when a message arrives on this topic, the method called callback will be called:

def listener():
    rospy.init_node('hello_world_subscriber', anonymous=True)
    rospy.Subscriber("hello_pub", String, callback)

2. The following keeps your node from exiting until the node is shut down:

    rospy.spin()
3. The following is the main check of the Python code. The main section calls the listener() method, which subscribes to the /hello_pub topic:

if __name__ == '__main__':
    listener()
4. After saving the two Python nodes, you need to make them executable using the chmod command:

chmod +x hello_world_publisher.py
chmod +x hello_world_subscriber.py
5. After changing the file permissions, build the package using the catkin_make command:

cd ~/catkin_ws
catkin_make
6. The following commands add the current ROS workspace path to every terminal session so that we can access the ROS packages inside this workspace:

echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc
The following is the output of the subscriber and publisher nodes:

1. First, we need to run roscore before starting the nodes. The roscore command, or ROS Master, is needed for nodes to communicate. So, the first command is:

$ roscore

2. After executing roscore, run each node using the following commands.
3. The following command will run the publisher:

$ rosrun hello_world hello_world_publisher.py

4. The following command will run the subscriber node. This node subscribes to the hello_pub topic, as shown in the following screenshot:

$ rosrun hello_world hello_world_subscriber.py
We covered some basics of ROS. Now, we will see what Gazebo is and how we can work with Gazebo using ROS.
Introducing Gazebo
Gazebo is a free and open source robot simulator in which we can test algorithms, design robots, and perform regression testing using realistic scenarios. Gazebo can accurately and efficiently simulate populations of robots in complex indoor and outdoor environments. Gazebo is built on a robust physics engine, with high-quality graphics and convenient programmatic and graphical interfaces.
The features of Gazebo are as follows:
Dynamic simulation: Gazebo can simulate the dynamics of a robot using physics engines such as Open Dynamics Engine (ODE) (http://opende.sourceforge.net/), Bullet (http://bulletphysics.org/wordpress/), Simbody (https://simtk.org/home/simbody/), and DART (http://dartsim.github.io/).
Advanced 3D graphics: Gazebo provides high-quality rendering, lighting, shadows, and texturing using the OGRE framework (http://www.ogre3d.org/).
Sensor support: Gazebo supports a wide range of sensors, including laser range finders, Kinect-style sensors, 2D/3D cameras, and so on. We can simulate them either with or without noise.
Plugins: We can develop custom plugins for robot, sensor, and environmental control. Plugins can access Gazebo's API.
Robot models: Gazebo provides models for popular robots, such as PR2, Pioneer 2 DX, iRobot Create, and TurtleBot. We can also build custom models of robots.
TCP/IP transport: We can run a simulation on a remote machine and interface with Gazebo through a socket-based message passing service.
Cloud simulation: We can run simulations on cloud servers using the CloudSim framework (http://cloudsim.io/).
Command-line tools: Extensive command-line tools are used to check and log simulations.
Installing Gazebo
Gazebo can be installed as a standalone application or as an integrated application along with ROS. In this chapter, we will use Gazebo along with ROS for simulation and to test our code using the ROS framework.

If you want to try the latest Gazebo simulator independently, you can follow the procedure given at http://gazebosim.org/download.

To work with Gazebo and ROS, we don't need to install it separately, because Gazebo is built in along with the ROS desktop-full installation.

The ROS package that integrates Gazebo with ROS is named gazebo_ros_pkgs, which provides wrappers around standalone Gazebo. This package provides the necessary interface to simulate a robot in Gazebo using ROS messages and services.

The complete gazebo_ros_pkgs can be installed in ROS Indigo using the following command:

$ sudo apt-get install ros-indigo-gazebo-ros-pkgs ros-indigo-gazebo-ros-control
Testing Gazebo with the ROS interface
Assuming that the ROS environment is properly set up, we can start roscore before starting Gazebo using the following command:

$ roscore

The following command will run Gazebo using ROS:

$ rosrun gazebo_ros gazebo
Gazebo runs as two executables: the Gazebo server and the Gazebo client. The Gazebo server executes the simulation process, and the Gazebo client is the Gazebo GUI. With the previous command, the Gazebo client and server run in parallel.

The Gazebo GUI is shown in the following screenshot:
After starting Gazebo, we will see the following topics generated. Using the rostopic command, we will find the following list of topics:

$ rostopic list
/gazebo/link_states
/gazebo/model_states
/gazebo/parameter_descriptions
/gazebo/parameter_updates
/gazebo/set_link_state
/gazebo/set_model_state
We can run the server and client separately using the following commands:

Run the Gazebo server using the following command:

$ rosrun gazebo_ros gzserver

Run the Gazebo client using the following command:

$ rosrun gazebo_ros gzclient
We have installed the basic packages of Gazebo in ROS. If you are not planning to build the hardware for this robot, the alternative plan is to buy another robot called TurtleBot (http://store.clearpathrobotics.com/products/turtlebot-2). Now, we will see how to install the TurtleBot stack on ROS.
Installing TurtleBot Robot packages on ROS Indigo
The TurtleBot installation procedure from its source is mentioned at http://wiki.ros.org/Robots/TurtleBot.
The following is a quick procedure to install the TurtleBot stack and its dependencies from the apt package manager:
1. First, you need to install the synaptic package manager using the following command. Synaptic is a graphical package management program for apt. It provides the same features as the apt-get command-line utility with a GUI frontend based on GTK+:
$ sudo apt-get install synaptic
2. After the installation of the synaptic package manager, open it and filter its searches using the ros-indigo-rocon keyword.
3. Install all the packages listed on synaptic, as shown in the following screenshot:
Rocon, also known as robotics in concert, is a dependency of the TurtleBot stack. This package mainly aims to bring ROS to multi-robot systems and devices such as tablets. You can read more about rocon at http://wiki.ros.org/rocon.
After the installation of rocon, we need to install another dependency called the kobuki package. Kobuki is a robotic mobile platform from Yujin Robot (http://wiki.ros.org/kobuki). TurtleBot packages are dependent on these packages.
The ros-indigo-kobuki package can be installed using synaptic, like the rocon package. The following is a screenshot of the installation:
The following is the step-by-step procedure to build and install TurtleBot's latest ROS packages from the source code in ROS Indigo. The dependencies for installing these packages were already met in the previous procedure.
1. Create a folder called turtlebot in the home folder using the following command:
$ mkdir ~/turtlebot
2. Switch the directory to turtlebot using the following command:
$ cd ~/turtlebot
3. Download the latest source code of TurtleBot using the following command:
$ wstool init src -j5 https://raw.github.com/yujinrobot/yujin_tools/master/rosinstalls/indigo/turtlebot.rosinstall
4. Install all the dependencies of the source code using the following command:
$ rosdep install --from-paths src -i -y
5. Build the source code using the following command:
$ catkin_make
6. To access TurtleBot packages from all terminals, we have to add source ~/turtlebot/devel/setup.bash to the .bashrc file. The following command will do this job:
$ echo "source ~/turtlebot/devel/setup.bash" >> ~/.bashrc
7. The following command will execute the .bashrc file:
$ source ~/.bashrc
Installing TurtleBot ROS packages using the apt package manager in Ubuntu
If you want to install TurtleBot packages without compiling the source code, you can use the apt package manager. The following is the command to install TurtleBot packages in ROS:
$ sudo apt-get install ros-indigo-turtlebot ros-indigo-turtlebot-apps ros-indigo-turtlebot-interactions ros-indigo-turtlebot-simulator ros-indigo-kobuki-ftdi ros-indigo-rocon-remocon
Let's check how to simulate TurtleBot in Gazebo and move the robot in an empty environment.
Simulating TurtleBot using Gazebo and ROS
The TurtleBot simulator package contains the turtlebot_gazebo package to simulate TurtleBot in Gazebo.
After the successful installation of the TurtleBot package, we can enter the following command to bring up the TurtleBot simulation using ROS and Gazebo:
$ roslaunch turtlebot_gazebo turtlebot_empty_world.launch
In another terminal, run the following command. This will execute a Python script to control TurtleBot using a keyboard; this is called keyboard teleoperation:
$ roslaunch turtlebot_teleop keyboard_teleop.launch
The following is a screenshot of the output:
The top-left terminal executes the simulation command and the bottom-left window executes the teleop command.
We can move the robot around with keyboard teleoperation using the keys mentioned on the screen. We can also monitor the values from the model using the rostopic command. We can view the current topics using the following command:
$ rostopic list
It publishes all the values from the sensors, such as the Kinect sensor, the odometry values from the wheel encoders, the IMU sensor values for odometry, and Gazebo's state values.
We will use a clone of the TurtleBot packages in which the robot model and simulation parameters are different. We can perform this kind of cloning for most mobile robots that have a differential steering system. We will create the packages for our robot by cloning the TurtleBot code. We will name our custom robot chefbot instead of turtlebot, and all our packages will be named accordingly.
Creating the Gazebo model from TurtleBot packages
In the TurtleBot packages, the simulation and kinematic models are implemented using two packages, that is, turtlebot_gazebo and turtlebot_description. The turtlebot_gazebo package has files to launch the simulation in Gazebo. The turtlebot_description package contains the Gazebo and kinematic models of the robot.
We customized and reused the turtlebot packages and recreated the same packages for our robot. We named our robot chefbot; we can create chefbot_gazebo, which contains the simulation launch files. Launch files in ROS are a kind of XML file in which we can launch multiple nodes and set multiple parameters by running a single file. To run a launch file, we have to use the roslaunch command.
Note
You can refer to http://wiki.ros.org/roslaunch for more information on the roslaunch tool.
The following command clones the complete ROS packages of ChefBot:
$ git clone https://github.com/qboticslabs/Chefbot_ROS_pkg.git
The chefbot_description package contains the kinematic model and the Gazebo model of the robot. The following figure shows the various files in these two packages:
The chefbot_description and chefbot_gazebo packages
Switch to catkin_ws, which we created to develop ROS packages. In the src folder, first create a folder called chefbot. Then, we can create all the packages of ChefBot in it, using the following commands:
$ cd ~/catkin_ws/src
$ mkdir chefbot
$ cd chefbot
The chefbot_gazebo package can be created using the following command with the required dependencies:
$ catkin_create_pkg chefbot_gazebo depthimage_to_laserscan diagnostic_aggregator gazebo_ros kobuki_gazebo_plugins robot_pose_ekf robot_state_publisher xacro yocs_cmd_vel_mux create_gazebo_plugins create_description create_driver create_node
After creating the package, you can copy the two folders in chefbot_gazebo from the source code of the chapter, which can be downloaded from the Packt Publishing website. This code is adapted from the turtlebot_gazebo package; you can also refer to its code for further reference.
Here is an explanation of the usage of each file. First, we discuss the chefbot_gazebo package. In the launch folder, there are launch files for each functionality:
chefbot_empty_world.launch: This file will launch the ChefBot model in Gazebo with an empty world, where the world is a Gazebo file containing information about the robot environment.
chefbot_playground.launch: This file will launch the ChefBot model in Gazebo. The simulated Gazebo environment contains some random objects, such as cylinders and boxes.
gmapping_demo.launch: This file will start Simultaneous Localization And Mapping (SLAM). Using SLAM, we can map the environment and store it for future use. In our case, we can map a hotel environment using this package. We will discuss more on gmapping in the upcoming chapter. For more information on SLAM, refer to http://wiki.ros.org/gmapping.
amcl_demo.launch: AMCL stands for Adaptive Monte Carlo Localization (http://wiki.ros.org/amcl). After mapping the environment, the robot can autonomously navigate by localizing itself on the map and by taking feedback from the wheels. The feedback from the robot is called odometry. The localization algorithm, AMCL, and the navigation algorithms, such as path planning, are performed by this launch file.
chefbot_base.launch.xml: This XML file will parse a xacro file called chefbot_circles_kinect.urdf.xacro, present in the chefbot_description folder, to URDF. After converting the xacro file to URDF, it will generate the robot model equivalent in ROS. We will learn more about URDF and xacro after this section.
After generating the robot model in URDF, this file will generate the Gazebo-compatible model from the URDF robot description. It will also start a velocity muxer node that prioritizes the command velocity of the robot. An example of a command velocity source is teleoperation by keyboard or joystick. According to the priority assigned, the command velocity will reach the robot. Let's discuss more on URDF and xacro to get a clear picture of the description of the robot.
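Before moving on, the priority rule applied by the velocity muxer can be sketched in plain Python. This is only a toy model of the idea; the actual yocs_cmd_vel_mux node is a ROS nodelet configured through a YAML file, and the source names and priorities below are hypothetical:

```python
# Toy model of a command velocity muxer: among the sources that are
# currently publishing, the highest-priority one (lowest number here)
# wins. Source names and priorities are hypothetical.
def select_cmd_vel(commands):
    """commands maps source name -> (priority, twist-or-None)."""
    active = [(priority, twist)
              for priority, twist in commands.values()
              if twist is not None]
    if not active:
        return None  # no source is publishing; the robot stops
    return min(active, key=lambda item: item[0])[1]

cmds = {
    "teleop":     (0, {"linear": 0.0, "angular": 0.5}),  # keyboard/joystick
    "navigation": (1, {"linear": 0.2, "angular": 0.0}),  # autonomous planner
}
chosen = select_cmd_vel(cmds)
print(chosen["angular"])  # teleop wins over navigation
```

When the teleop source stops publishing, the navigation command takes over, which is the behavior we rely on while mapping and navigating later in this chapter.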
What is a robot model, URDF, xacro, and robot state publisher?
The robot model in ROS contains packages to model the various aspects of the robot, which are specified in the URDF (Unified Robot Description Format). The core package of this stack is urdf, which parses URDF files and constructs an object model of the robot.
URDF is an XML specification to describe the model of a robot. We can represent the following features of the robot using URDF:
The kinematic and dynamic description of the robot
The visual representation of the robot
The collision model of the robot
The description of the robot consists of a set of link elements and a set of joint elements, which connect these links together. A typical robot description is shown in the following code:
<robot name="chefbot">
  <link> ... </link>
  <link> ... </link>
  <link> ... </link>
  <joint> .... </joint>
  <joint> .... </joint>
  <joint> .... </joint>
</robot>
Note
It will be good if you refer to the following links for more information on URDF:
http://wiki.ros.org/urdf
http://wiki.ros.org/urdf/Tutorials
Xacro (XML Macros) is an XML macro language. With xacro, we can create shorter and more readable XML files. We can use xacro along with URDF to simplify the URDF file. If we use xacro in a URDF, we have to call the xacro parser to expand the macros into plain URDF.
Note
The following link can give you more ideas about xacro:
http://wiki.ros.org/xacro
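Because URDF is plain XML, Python's standard xml.etree module is already enough for a quick look at a robot description. The following is a small sketch using a hand-written URDF string; the link and joint names are illustrative, not taken from the ChefBot sources:

```python
import xml.etree.ElementTree as ET

# A tiny hand-written URDF string; link and joint names are illustrative.
urdf = """
<robot name="chefbot">
  <link name="base_link"/>
  <link name="wheel_left_link"/>
  <joint name="wheel_left_joint" type="continuous">
    <parent link="base_link"/>
    <child link="wheel_left_link"/>
  </joint>
</robot>
"""

root = ET.fromstring(urdf)
links = [link.get("name") for link in root.findall("link")]
joints = [(j.get("name"), j.get("type")) for j in root.findall("joint")]
print(links)   # ['base_link', 'wheel_left_link']
print(joints)  # [('wheel_left_joint', 'continuous')]
```

The same approach works on a real URDF file (for example, one produced by expanding a xacro file) by passing the file path to ET.parse instead.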
robot_state_publisher allows you to publish the state of the robot to tf (http://wiki.ros.org/tf). Once the state gets published, it is available to all the components in the system that also use tf. The package takes the joint angles of the robot as input and publishes the 3D poses of the robot links using the kinematic tree model of the robot. The package can be used as a library and as a ROS node. This package has been well tested and the code is stable. No major changes are planned in the near future.
World files: These represent the environment of Gazebo, which has to be loaded along with the robot. The empty.world and playground.world files are included in the launch files, so they will load when Gazebo starts.
CMakeLists.txt and package.xml: These files are created during the creation of the package. The CMakeLists.txt file helps to build the nodes or libraries within a package, and the package.xml file holds the list of all the dependencies of this package.
Creating a ChefBot description ROS package
The chefbot_description package contains the urdf model of our robot. Before creating this package on your own, you can go through the downloaded packages of ChefBot. This can help you speed up the process.
Let's check how to create the chefbot_description package. The following procedure will guide you in creating this package:
1. First, we need to switch to the chefbot folder in the src folder:
$ cd ~/catkin_ws/src/chefbot
2. The following command will create the robot description package along with dependencies, such as urdf, xacro, and the description packages of the Kobuki and Create mobile robots:
$ catkin_create_pkg chefbot_description urdf xacro kobuki_description create_description
3. Copy the meshes, urdf, and robots folders from the downloaded source to the package folder. The meshes folder holds the 3D parts of the robot and the urdf folder contains the urdf description and the sensors of the robot. The entire robot model is divided into a set of xacro files for easier debugging and better readability.
Let's see the functionality of each file inside this package. You can refer to the downloaded source code to check these files, and you can also copy these files from the downloaded files to the newly created folder. The functionality of each file in the urdf folder is as follows:
chefbot_base.urdf.xacro: This xacro represents the kinematic model of the entire robot. It models all the joints of the robot using the URDF tags. The joints include two wheels, two caster wheels, gyro sensors, and so on. The 3D Kinect sensor is not modeled in this file. It will also attach meshes to each link. This file is reused from the Kobuki mobile base package.
chefbot_base_gazebo.urdf.xacro: This is the Gazebo model representation of each link of the robot. It includes the actuator definitions, sensor definitions, the parameter settings of the differential-drive robot, and so on. Gazebo uses these values to perform the simulation. The robot parameters can be changed by changing the values in this file.
chefbot_gazebo.urdf.xacro: The previous Gazebo urdf does not have the definitions of the 3D sensor, Kinect. This file starts the kinect_openni Gazebo plugin to simulate the Kinect sensor on the robot.
chefbot_library.urdf.xacro: This file includes all the xacro files and sensors of the robot. This single file can launch all the descriptions of the robot.
chefbot_properties.urdf.xacro: This file includes the 3D Kinect sensor position on the robot model.
common_properties.urdf.xacro: This file contains properties of meshes, such as color.
kinect.urdf.xacro: This file contains the Gazebo parameters of the Kinect and is present inside the sensors folder. This file is included in the chefbot_gazebo.urdf.xacro and chefbot_properties.urdf.xacro files to set the Kinect parameters.
chefbot_circles_kinect_urdf.xacro: This file is inside the robots folder. It includes the chefbot_library.urdf.xacro file, which will load all the robot description files needed to start the simulation.
In the meshes folder, we mainly have the wheels and the body of the robot, the 3D model parts of ChefBot.
Similar to TurtleBot, we can launch the ChefBot simulation using the following command:
$ roslaunch chefbot_gazebo chefbot_empty_world.launch
When we execute this command, the launch files will execute in order, as shown in the following screenshot:
We have already seen the functionality of each file. The important files we need to discuss are:
chefbot_gazebo.urdf.xacro
kinect.urdf.xacro
chefbot_base.urdf.xacro
chefbot_base_gazebo.urdf.xacro
Let's take a look at chefbot_base_gazebo.urdf.xacro. The actual file definition is pretty long, so we will only discuss the important parts.
Before discussing the Gazebo definitions, we can refer to the Gazebo tag parameters mentioned in URDF. The various tags that can be used can be found at http://osrf-distributions.s3.amazonaws.com/sdformat/api/1.5.html.
The Gazebo definition for each link is mentioned in the URDF as <gazebo></gazebo>. The following URDF definitions for individual joints of the robot are modeled using the Gazebo parameters. The joints include the wheel joints and the caster wheel joints. The mu1 and mu2 parameters are the coefficients of friction. kp and kd indicate the dynamic stiffness and damping of a joint. minDepth is the minimum allowable depth before the contact correction impulse is applied. maxVel is the maximum contact correction velocity truncation term:
<?xml version="1.0"?>
<robot name="kobuki_sim" xmlns:xacro="http://ros.org/wiki/xacro">
<xacro:macro name="kobuki_sim">
<gazebo reference="wheel_left_link">
<mu1>1.0</mu1>
<mu2>1.0</mu2>
<kp>1000000.0</kp>
<kd>100.0</kd>
<minDepth>0.001</minDepth>
<maxVel>1.0</maxVel>
</gazebo>
<gazebo reference="wheel_right_link">
<mu1>1.0</mu1>
<mu2>1.0</mu2>
<kp>1000000.0</kp>
<kd>100.0</kd>
<minDepth>0.001</minDepth>
<maxVel>1.0</maxVel>
</gazebo>
<gazebo reference="caster_front_link">
<mu1>0.0</mu1>
<mu2>0.0</mu2>
<kp>1000000.0</kp>
<kd>100.0</kd>
<minDepth>0.001</minDepth>
<maxVel>1.0</maxVel>
</gazebo>
<gazebo reference="caster_back_link">
<mu1>0.0</mu1>
<mu2>0.0</mu2>
<kp>1000000.0</kp>
<kd>100.0</kd>
<minDepth>0.001</minDepth>
<maxVel>1.0</maxVel>
</gazebo>
![Page 96: the-eye.eu · Table of Contents Learning Robotics Using Python Credits About the Author About the Reviewers Support files, eBooks, discount offers, and more Why subscribe? Free](https://reader036.vdocuments.mx/reader036/viewer/2022070110/6048e5c980e6911cf74d0afb/html5/thumbnails/96.jpg)
The following section is used for the Inertial Measurement Unit (IMU) sensor in the robot (http://en.wikipedia.org/wiki/Inertial_measurement_unit), modeled in Gazebo. The main use of the IMU in the robot is to generate a good odometry value:
<gazebo reference="gyro_link">
<sensor type="imu" name="imu">
<always_on>true</always_on>
<update_rate>50</update_rate>
<visualize>false</visualize>
<imu>
<noise>
<type>gaussian</type>
<rate>
<mean>0.0</mean>
<stddev>${0.0014*0.0014}</stddev> <!-- 0.25 x 0.25 (deg/s) -->
<bias_mean>0.0</bias_mean>
<bias_stddev>0.0</bias_stddev>
</rate>
<accel> <!-- not used in the plugin and real robot, hence using tutorial values -->
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
<bias_mean>0.1</bias_mean>
<bias_stddev>0.001</bias_stddev>
</accel>
</noise>
</imu>
</sensor>
</gazebo>
The differential-drive controller plugin for Gazebo is given in the following code. We will reuse the Kobuki differential-drive plugin for the drive system. We also mention the main measurements of the robot, such as the wheel separation, wheel diameter, torque of the motor, and so on, in this section. This section also includes the cliff sensor, which will not be used in our model. We may ignore this section if we don't want to use it:
<gazebo>
<plugin name="kobuki_controller" filename="libgazebo_ros_kobuki.so">
<publish_tf>1</publish_tf>
<left_wheel_joint_name>wheel_left_joint</left_wheel_joint_name>
<right_wheel_joint_name>wheel_right_joint</right_wheel_joint_name>
<wheel_separation>.30</wheel_separation>
<wheel_diameter>0.09</wheel_diameter>
<torque>18.0</torque>
<velocity_command_timeout>0.6</velocity_command_timeout>
<cliff_sensor_left_name>cliff_sensor_left</cliff_sensor_left_name>
<cliff_sensor_center_name>cliff_sensor_front</cliff_sensor_center_name>
<cliff_sensor_right_name>cliff_sensor_right</cliff_sensor_right_name>
<cliff_detection_threshold>0.04</cliff_detection_threshold>
<bumper_name>bumpers</bumper_name>
<imu_name>imu</imu_name>
</plugin>
</gazebo>
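The wheel_separation and wheel_diameter parameters above define how a commanded body velocity is split between the two wheels. The following is a small sketch of that differential-drive relation, using the same numbers as the plugin; it is only for intuition, as the actual conversion happens inside the Kobuki plugin:

```python
# Differential-drive inverse kinematics: body velocity -> wheel speeds,
# using the same wheel_separation and wheel_diameter as the plugin above.
WHEEL_SEPARATION = 0.30  # meters
WHEEL_DIAMETER = 0.09    # meters

def wheel_speeds(linear, angular):
    """Return (left, right) wheel angular velocities in rad/s for a
    commanded linear velocity (m/s) and angular velocity (rad/s)."""
    radius = WHEEL_DIAMETER / 2.0
    v_left = linear - angular * WHEEL_SEPARATION / 2.0
    v_right = linear + angular * WHEEL_SEPARATION / 2.0
    return v_left / radius, v_right / radius

left, right = wheel_speeds(0.2, 0.0)     # straight line: both wheels equal
turn_l, turn_r = wheel_speeds(0.0, 1.0)  # in-place turn: opposite signs
```

Driving straight gives equal wheel speeds, while turning in place gives equal and opposite speeds, which is exactly what a differential steering system does.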
kinect.urdf.xacro
This file mainly has the definitions of the joints and links of the Kinect sensor. It also includes two other files of the robot description:
<xacro:include filename="$(find chefbot_description)/urdf/chefbot_gazebo.urdf.xacro"/>
<xacro:include filename="$(find chefbot_description)/urdf/chefbot_properties.urdf.xacro"/>
The chefbot_gazebo.urdf.xacro file consists of the Kinect plugin for Gazebo. We will reuse this plugin from TurtleBot. The Kinect plugin is actually the libgazebo_ros_openni_kinect.so file; we can also define the parameters of the Kinect, as shown in the following code:
<plugin name="kinect_camera_controller" filename="libgazebo_ros_openni_kinect.so">
<cameraName>camera</cameraName>
<alwaysOn>true</alwaysOn>
<updateRate>10</updateRate>
<imageTopicName>rgb/image_raw</imageTopicName>
<depthImageTopicName>depth/image_raw</depthImageTopicName>
<pointCloudTopicName>depth/points</pointCloudTopicName>
<cameraInfoTopicName>rgb/camera_info</cameraInfoTopicName>
<depthImageCameraInfoTopicName>depth/camera_info</depthImageCameraInfoTopicName>
<frameName>camera_depth_optical_frame</frameName>
<baseline>0.1</baseline>
<distortion_k1>0.0</distortion_k1>
<distortion_k2>0.0</distortion_k2>
<distortion_k3>0.0</distortion_k3>
<distortion_t1>0.0</distortion_t1>
<distortion_t2>0.0</distortion_t2>
<pointCloudCutoff>0.4</pointCloudCutoff>
</plugin>
chefbot_base.urdf.xacro
This file defines the links and joints of the robot and also includes the chefbot_gazebo.urdf.xacro file. The joints of the robot are the wheels, caster wheels, and so on. Here is the XML definition of the body, the wheels, and the caster wheels of the robot.
The base link of the robot includes the robot body, excluding the wheels. We can export the robot body mesh from Blender and convert it to the .DAE format using MeshLab (http://en.wikipedia.org/wiki/COLLADA). The base joint is of a fixed type; there is no movement on the base plate. We can define the collision profile and inertia for each link of the robot. These files are reused from TurtleBot, as shown in the following code:
<xacro:macro name="kobuki">
<link name="base_footprint"/>
<!--
  Base link is set at the bottom of the base mould.
  This is done to be compatible with the way base link
  was configured for turtlebot 1. Refer to
  https://github.com/turtlebot/turtlebot/issues/40
  To put the base link at the more often used wheel
  axis, set the z-distance from the base_footprint
  to 0.352.
-->
<joint name="base_joint" type="fixed">
<origin xyz="0 0 0.0102" rpy="0 0 0"/>
<parent link="base_footprint"/>
<child link="base_link"/>
</joint>
<link name="base_link">
<visual>
<geometry>
<!-- new mesh -->
<mesh filename="package://chefbot_description/meshes/base_plate.dae"/>
</geometry>
<!-- <origin xyz="0.001 0 0.05199" rpy="0 0 ${M_PI/2}"/> -->
<origin xyz="0.001 0 -0.034" rpy="0 0 ${M_PI/2}"/>
</visual>
<collision>
<geometry>
<cylinder length="0.10938" radius="0.178"/>
</geometry>
<origin xyz="0.0 0 0.05949" rpy="0 0 0"/>
</collision>
<inertial>
<!-- COM experimentally determined -->
<origin xyz="0.01 0 0"/>
<mass value="2.4"/> <!-- 2.4/2.6 kg for small/big battery pack -->
<!-- Kobuki's inertia tensor is approximated by a cylinder with
     homogeneous mass distribution
     More details:
     http://en.wikipedia.org/wiki/List_of_moment_of_inertia_tensors
     m = 2.4 kg; h = 0.09 m; r = 0.175 m
     ixx = 1/12 * m * (3 * r^2 + h^2)
     iyy = 1/12 * m * (3 * r^2 + h^2)
     izz = 1/2 * m * r^2
-->
<inertia ixx="0.019995" ixy="0.0" ixz="0.0"
         iyy="0.019995" iyz="0.0"
         izz="0.03675"/>
</inertial>
</link>
<!-- One of the wheel joints is given below. The kind of joint used here is a
     continuous joint. -->
<joint name="wheel_left_joint" type="continuous">
<parent link="base_link"/>
<child link="wheel_left_link"/>
<origin xyz="0 ${0.28/2} 0.026" rpy="${-M_PI/2} 0 0"/>
<axis xyz="0 0 1"/>
</joint>
<link name="wheel_left_link">
<visual>
<geometry>
<mesh filename="package://chefbot_description/meshes/wheel.dae"/>
</geometry>
<origin xyz="0 0 0" rpy="0 0 0"/>
</visual>
<collision>
<geometry>
<cylinder length="0.0206" radius="0.0352"/>
</geometry>
<origin rpy="0 0 0" xyz="0 0 0"/>
</collision>
<inertial>
<mass value="0.01"/>
<origin xyz="0 0 0"/>
<inertia ixx="0.001" ixy="0.0" ixz="0.0"
         iyy="0.001" iyz="0.0"
         izz="0.001"/>
</inertial>
</link>
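The hardcoded inertia values for the base link can be checked against the cylinder formulas quoted in the URDF comment. A quick numerical sketch:

```python
# Moments of inertia of a solid cylinder with homogeneous mass
# distribution, matching the formulas quoted in the URDF comment:
# ixx = iyy = 1/12 * m * (3*r^2 + h^2), izz = 1/2 * m * r^2
def cylinder_inertia(m, r, h):
    ixx = iyy = (1.0 / 12.0) * m * (3.0 * r ** 2 + h ** 2)
    izz = 0.5 * m * r ** 2
    return ixx, iyy, izz

# Kobuki base approximation: m = 2.4 kg, r = 0.175 m, h = 0.09 m
ixx, iyy, izz = cylinder_inertia(2.4, 0.175, 0.09)
print(round(ixx, 6), round(izz, 6))  # 0.019995 0.03675
```

The results match the ixx, iyy, and izz attributes in the inertia tag, which is a useful sanity check whenever you change the mass or dimensions of the robot body.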
Simulating ChefBot and TurtleBot in a hotel environment
After discussing each file, we can try simulating the two robots in a hotel environment. The procedures and screenshots of the simulation are as follows.
Similar to TurtleBot, we can start ChefBot using the following command:
$ roslaunch chefbot_gazebo chefbot_empty_world.launch
It will show the robot in Gazebo, as shown in the following screenshot:
We can exit Gazebo and start building the hotel environment for the robot.
The first procedure is to create a world file and save it with the .world file extension. A typical hotel environment with nine block-like tables is shown in the following screenshot. You can start from an empty Gazebo world using the following commands and make an environment using the basic shapes available in Gazebo:
1. Start roscore using the following command:
$ roscore
2. Start Gazebo with an empty world using the following command:
$ rosrun gazebo_ros gazebo
3. Now we have created an environment, as shown in the following screenshot, and saved it as empty.world.
4. Copy empty.world to the world folder in the chefbot_description package. Start the robot with this environment using the following command:
$ roslaunch chefbot_gazebo chefbot_empty_world.launch
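For reference, a Gazebo world file is just an SDF document. A minimal sketch of what an empty.world might contain is shown below; the SDF version number and the included models can vary with your Gazebo release, so treat this only as a starting point:

```xml
<?xml version="1.0"?>
<sdf version="1.4">
  <world name="default">
    <!-- A ground plane and a light source are the usual minimum -->
    <include>
      <uri>model://ground_plane</uri>
    </include>
    <include>
      <uri>model://sun</uri>
    </include>
  </world>
</sdf>
```

Any tables or walls you add through the Gazebo GUI are saved as additional model entries inside the world element.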
This will bring up the same environment that we created before, along with ChefBot. The procedure remains the same for TurtleBot.
TurtleBot users have to copy the world file into the turtlebot_description folder instead of chefbot_description:
1. Start the gmapping launch file to start mapping this area. We can use the following command to launch the gmapping process:
$ roslaunch chefbot_gazebo gmapping_demo.launch
2. In TurtleBot, use the following command:
$ roslaunch turtlebot_gazebo gmapping_demo.launch
3. It will start the gmapping process. If we want to view the mapping process, we can start RViz, a tool in ROS to visualize sensor data (http://wiki.ros.org/rviz). The command is the same as it was for TurtleBot:
$ roslaunch turtlebot_rviz_launchers view_navigation.launch
The screenshot of RViz is as follows:
To create a map of the room, we have to start the keyboard teleoperation:
1. Using the keyboard teleoperation function, we can move the robot using the keyboard so that it can map the entire area:
$ roslaunch turtlebot_teleop keyboard_teleop.launch
2. The command is the same for TurtleBot. A complete map of the surroundings is shown in the following screenshot:
3. After building the map, we can save it under the name hotel_world using the following command:
$ rosrun map_server map_saver -f ~/hotel_world
The command is the same for TurtleBot.
4. After saving the map, exit all the other applications that are currently in use.
5. After the map is generated, the next step is autonomous navigation and localization of the robot using the built map.
6. Start Gazebo using the following command:
$ roslaunch chefbot_gazebo chefbot_empty_world.launch
7. In TurtleBot, use the following command:
$ roslaunch turtlebot_gazebo turtlebot_empty_world.launch
8. Start the amcl demo in ChefBot. Note the path of the map file because it may vary for each user.
9. For ChefBot, use the following command:
$ roslaunch chefbot_gazebo amcl_demo.launch map_file:=/home/lentin/hotel_world.yaml
10. For TurtleBot, use the following command:
$ roslaunch turtlebot_gazebo amcl_demo.launch map_file:=/home/lentin/hotel_world.yaml
11. Start RViz using the following command. This is the same for both robots:
$ roslaunch turtlebot_rviz_launchers view_navigation.launch
Now we can command the robot to navigate to a position on the map using the 2D Nav Goal button. Click on this button and select a position that is near a table. After clicking on this position, it will plan the path to that point and navigate to that position, as shown in the following screenshot:
The robot can avoid obstacles and also plan the shortest path to the goal position. After several runs, we found that the robot works perfectly if the map we built is accurate. The map-building procedure can be fine-tuned using the instructions at http://wiki.ros.org/costmap_2d. An application such as serving food requires pretty good accuracy in the map, so that the robot can deliver food to the correct position.
Questions
1. What is robot simulation and what are the popular robot simulators?
2. What are ROS and Gazebo?
3. What is the robot model in ROS?
4. What are gmapping and AMCL?
Summary
In this chapter, you learned how to simulate a custom robot called ChefBot. We discussed the design of the robot in the previous chapter. After the robot design, we moved on to simulating the robot in a virtual environment to test the design of the robot, and checked whether it met our specifications. In this chapter, you learned about simulation and the various simulator applications used in industry, research, and education in detail. After that, we discussed how the ROS framework and the Gazebo simulator are used to perform the simulation work. We also created a sample hello_world package using ROS. We installed the TurtleBot stack and created ROS packages from the TurtleBot stack. Finally, we simulated the robot and performed gmapping and autonomous navigation in a hotel environment. We learned that the accuracy of the simulation depends on the map, and that the robot will work better in simulation if the generated map is perfect.
Chapter 4. Designing ChefBot Hardware
In this chapter, we will discuss the design and working of the ChefBot hardware and the selection of its hardware components. In the previous chapter, we designed and simulated the basic robot framework in a hotel environment using Gazebo and ROS, and tested various measurements such as the robot body mass, motor torque, wheel diameter, and so on. Also, we tested the autonomous navigation capability of ChefBot in a hotel environment.
To achieve this goal in hardware, we need to select all the hardware components and find out how to interconnect them. We know that the main functionality of this robot is navigation; this robot will have the ability to navigate from the start position to the end position without any collision with its surroundings. We will discuss the different sensors and hardware components required to achieve this goal. We will see a block diagram representation and its explanation, and also discuss the main working of the robot. Finally, we will select the components required to build the robot. We will also see the online stores where we can purchase these components.
If you have a TurtleBot, you may skip this chapter because it is only for those who need to build the robot hardware. Let's see what specifications we have to meet in the hardware design. The robot hardware mainly includes the robot chassis, sensors, actuators, controller boards, and a PC.
SpecificationsoftheChefBothardwareInthissection,wewilldiscussingsomeoftheimportantspecificationsthatwementionedinChapter2,MechanicalDesignofaServiceRobot.Thefinalrobotprototypewillmeetthefollowingspecifications:
- Simple and cost-effective robot chassis design: The robot chassis design should be simple and cost effective.
- Autonomous navigation functionality: The robot should navigate autonomously and should contain the necessary sensors for doing this.
- Long battery life: The robot should have a long battery life in order to work continuously. The working time should be greater than 1 hour.
- Obstacle avoidance: The robot should be able to avoid static and dynamic objects in its surroundings.
The robot hardware design should meet these specifications. Let's look at one of the possible ways of interconnecting the components in this robot. The next section shows the block diagram of the robot and explains it.
Block diagram of the robot

The robot's movement is controlled by two Direct Current (DC) gear motors with encoders. The two motors are driven using a motor driver. The motor driver is interfaced with an embedded controller board, which sends commands to the motor driver to control the motor movements. The encoder of each motor is interfaced with the controller board for counting the number of rotations of the motor shaft. This data is the odometry data from the robot. There are ultrasonic sensors, interfaced with the controller board, for sensing obstacles and measuring the distance from them. There is an IMU sensor to improve the odometry calculation. The embedded controller board is interfaced with a PC, which does all the high-end processing in the robot. Vision and sound sensors are interfaced with the PC, and Wi-Fi is attached for remote operations.
Each block of the robot is explained in the following diagram:

Robot hardware block diagram
Motor and encoder

The robot that we are going to design is a differential drive robot with two wheels, so we require two motors for its locomotion. Each motor has a quadrature encoder (http://letsmakerobots.com/node/24031) to get motor rotation feedback.

A quadrature encoder reports the rotation of the motor as square pulses; we can decode the pulses to get the number of encoder ticks, which can be used as feedback. If we know the wheel diameter and the number of ticks of the motor, we can compute the displacement and heading change of the robot. This computation is very useful for the navigation of the robot.
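The tick-to-displacement calculation can be sketched in Python. The numbers below (8400 ticks per output-shaft revolution, a 90 mm wheel) are taken from the motor and wheel chosen later in this chapter; the 0.3 m wheel base is an illustrative assumption, not a ChefBot measurement:

```python
import math

TICKS_PER_REV = 8400      # encoder counts per revolution of the gearbox output shaft
WHEEL_DIAMETER_M = 0.09   # 90 mm wheel

def wheel_distance(ticks):
    """Distance travelled by one wheel for a given encoder tick count."""
    revolutions = ticks / TICKS_PER_REV
    return revolutions * math.pi * WHEEL_DIAMETER_M

def odometry_delta(left_ticks, right_ticks, wheel_base_m=0.3):
    """Return (displacement, heading change) for a differential drive robot.

    wheel_base_m is the distance between the two wheels (assumed 0.3 m here).
    """
    d_left = wheel_distance(left_ticks)
    d_right = wheel_distance(right_ticks)
    displacement = (d_left + d_right) / 2.0       # distance of the robot centre
    d_theta = (d_right - d_left) / wheel_base_m   # heading change in radians
    return displacement, d_theta

# One full revolution on both wheels moves the robot straight ahead by one
# wheel circumference, with no change in heading.
d, theta = odometry_delta(8400, 8400)
```

Accumulating these deltas over time gives the robot's estimated pose, which is exactly the odometry data the controller board will later send to the PC.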
Selecting motors, encoders, and wheels for the robot

From the simulation, we got an idea about the robot parameters. In the simulation, we mentioned that the motor torque needed to drive the robot is 18 kg-cm; since the calculated torque is less than this, selecting a higher-torque motor gives us a safety margin and better performance. One of the economical motors that we might consider is from Pololu. According to our design, we can select a high-torque DC gear motor with an encoder, working at 12 V DC with a speed of 80 RPM. We are choosing the following motor for the drive system of this robot:
http://www.pololu.com/product/1447
The following figure shows the image of the selected motor for this robot. The motor comes with an integrated quadrature encoder with a resolution of 64 counts per revolution of the motor shaft, which corresponds to 8400 counts per revolution of the gearbox's output shaft.

DC gear motor with encoder and wheel

This motor has six pins with different colors. The pin description of this motor is given in the following table:
Color    Function
Red      Motor power (connects to one motor terminal)
Black    Motor power (connects to the other motor terminal)
Green    Encoder GND
Blue     Encoder Vcc (3.5 V - 20 V)
Yellow   Encoder A output
White    Encoder B output
According to our design, we chose a wheel diameter of 90 mm. Pololu provides a 90 mm wheel, which is available at http://www.pololu.com/product/1439. The preceding figure showed the motor assembled with this wheel.

The other connectors needed to connect the motors and wheels together are available as follows:

- The mounting hub required to mount the wheel to the motor shaft is available at http://www.pololu.com/product/1083
- The L-bracket to mount the motor on the robot chassis is available at http://www.pololu.com/product/1084
Motor driver

A motor driver or motor controller is a circuit that can control the speed of a motor. Controlling a motor means that we can control the voltage across the motor, and thereby its direction and speed. Motors can rotate clockwise or counterclockwise if we change the polarity of the motor terminals.

H-bridge circuits are commonly used in motor controllers. An H-bridge is an electronic circuit that can apply voltage across a load in either direction. It has high current handling capability and can change the direction of current flow.

The following figure shows a basic H-bridge circuit using switches:

H-bridge circuit
The direction of the motor, depending on the four switches, is given as follows:

S1  S2  S3  S4  Result
1   0   0   1   Motor moves right
0   1   1   0   Motor moves left
0   0   0   0   Motor free runs
0   1   0   1   Motor brakes
1   0   1   0   Motor brakes
1   1   0   0   Motor shoots through
0   0   1   1   Motor shoots through
1   1   1   1   Motor shoots through
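The truth table above can be modeled as a small Python function. The mapping below assumes S1/S2 form the left leg of the bridge and S3/S4 the right leg (high side and low side respectively), which is inferred from the table itself and is only an illustration of the switching logic:

```python
def h_bridge_state(s1, s2, s3, s4):
    """Map four H-bridge switch states (1 = closed, 0 = open) to the motor action.

    Assumed layout: S1/S2 are the left leg, S3/S4 the right leg.
    """
    # Closing both switches of one leg shorts the supply rail to ground.
    if (s1 and s2) or (s3 and s4):
        return "shoot-through"   # must never be commanded
    if s1 and s4:
        return "moves right"     # current flows one way through the motor
    if s2 and s3:
        return "moves left"      # current flows the other way
    if (s1 and s3) or (s2 and s4):
        return "brakes"          # both motor terminals tied to the same rail
    return "free-runs"           # no closed path: the motor coasts
```

A real driver IC enforces the shoot-through check in hardware; in this model it is just the first branch.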
We have seen the basics of the H-bridge circuit used in motor drivers. Now, we can select a motor driver for our application and discuss how it works.

Selecting a motor driver/controller

There are some motor drivers available from Pololu that are compatible with the selected motor. The following figure shows one of the motor drivers that we will use in our robot:
Dual VNH2SP30 motor driver carrier MD03A

This motor driver is available at http://www.pololu.com/product/708.
This driver can drive two motors with a combined maximum current rating of 30 A, and contains two integrated ICs, one for driving each motor. The pin description of this driver is given in the upcoming sections.

Input pins

The following pins are the input pins of the motor driver, by which we mainly control the motor speed and direction:
Pin name                Function
1DIAG/EN, 2DIAG/EN      These monitor the fault conditions of motor drivers 1 and 2. In normal operation, they remain disconnected.
1INa, 1INb, 2INa, 2INb  These pins control the direction of motors 1 and 2 in the following manner:
                        If INA = INB = 0, the motor will brake
                        If INA = 1, INB = 0, the motor will rotate clockwise
                        If INA = 0, INB = 1, the motor will rotate counterclockwise
                        If INA = INB = 1, the motor will brake
1PWM, 2PWM              These control the speed of motors 1 and 2 by rapidly turning them on and off.
1CS, 2CS                These are the current sensing pins for each motor.
Output pins

The output pins of the motor driver drive the two motors. The following are the output pins:

Pin name        Function
OUT1A, OUT1B    These pins connect to the motor 1 power terminals
OUT2A, OUT2B    These pins connect to the motor 2 power terminals
Power supply pins

The following are the power supply pins:

Pin name           Function
VIN (+), GND (-)   These are the supply pins for the two motors. The voltage ranges from 5.5 V to 16 V.
+5V (IN), GND (-)  This is the supply for the motor driver itself. The voltage should be 5 V.
Embedded controller board

Controller boards are typically I/O boards that can send control signals in the form of digital pulses to the H-bridge/motor driver board and can receive inputs from sensors such as ultrasonic and IR sensors. We can also interface the motor encoders with the controller board for motor feedback.
The main functionalities of the Launchpad in this robot are:

- Interfacing the motor driver and encoders
- Interfacing the ultrasonic sound sensor
- Sending sensor values to the PC and receiving commands from the PC
We will deal with I/O boards and interfacing with different components in the upcoming chapters. Some of the popular I/O boards are Arduino (arduino.cc) and the Tiva C LaunchPad (http://www.ti.com/tool/EK-TM4C123GXL) by Texas Instruments. We are selecting the Tiva C LaunchPad over Arduino because of the following factors:

- The Tiva C LaunchPad has a microcontroller based on a 32-bit ARM Cortex-M4 with 256 KB flash memory, 32 KB SRAM, and 80 MHz operation; most Arduino boards run below this specification
- Outstanding processing performance, combined with fast interrupt handling
- 12 timers
- 16 PWM outputs
- Two quadrature encoder inputs
- Eight Universal Asynchronous Receiver/Transmitters (UARTs)
- 5 V tolerant General-Purpose Input/Output (GPIO)
- Low cost and small size compared to Arduino boards
- An easily programmable IDE called Energia (http://energia.nu/); code written in Energia is Arduino compatible
The following image shows Texas Instruments' Tiva C LaunchPad:

Tiva C LaunchPad
The pinout of the Texas Instruments Launchpad series is given at http://energia.nu/pin-maps/guide_stellarislaunchpad/. This pinout is compatible with all the Launchpad series. It is also used while programming in the Energia IDE.
Ultrasonic sensors

Ultrasonic sensors, also called ping sensors, are mainly used to measure the robot's distance from an object. The main application of ping sensors is obstacle avoidance. The ultrasonic sensor sends high-frequency sound waves and evaluates the echoes received back from the object. The sensor measures the delay between sending the sound and receiving the echo, and from that determines the distance to the object.

In our robot, collision-free navigation is important, as collisions may damage the robot. You will see a figure showing an ultrasonic sensor in the next section. Ultrasonic sensors can be mounted on the sides and back of the robot to detect obstacles there. The Kinect is also used for obstacle detection and collision avoidance, but it is only accurate beyond about 0.8 m, so obstacles closer than 0.8 m must be detected using the ultrasonic sensors. The ultrasonic sensor is thus an add-on to our robot that improves collision avoidance and detection.
Selecting the ultrasonic sensor

One of the most popular and cheapest ultrasonic sensors available is the HC-SR04. We are selecting this sensor for our robot because of the following factors:

- The range of detection is from 2 cm to 4 m
- The working voltage is 5 V
- The working current is very low, typically 15 mA

We can use this sensor for accurate detection of obstacles; it also works at 5 V. Here is an image of the HC-SR04 and its pinout:
Ultrasonic sound sensor
The pins and their descriptions are given as follows:

Pins      Function
Vcc, GND  These are the supply pins of the ultrasonic sensor. Normally, we need to apply 5 V for normal operation.
Trig      This is the input pin of the sensor. We apply a pulse of a particular duration to this pin to send the ultrasonic sound waves.
Echo      This is the output pin of the sensor. It generates a pulse whose duration corresponds to the time taken by the sound wave to return to the sensor.
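The echo-duration-to-distance conversion is simple: sound travels at roughly 343 m/s at room temperature, and the echo covers the distance to the obstacle twice. A minimal sketch (the speed-of-sound value is a room-temperature assumption; HC-SR04 documentation often states the equivalent rule of thumb distance_cm = pulse_us / 58):

```python
SPEED_OF_SOUND_M_S = 343.0   # at roughly 20 degrees C; varies with temperature

def echo_to_distance_m(echo_pulse_s):
    """Convert an HC-SR04 echo pulse duration (seconds) to distance in metres.

    The pulse covers the round trip to the obstacle and back, hence the / 2.
    """
    return echo_pulse_s * SPEED_OF_SOUND_M_S / 2.0

# A 2.9 ms echo corresponds to an obstacle roughly half a metre away.
d = echo_to_distance_m(0.0029)
```

In the real firmware, the Launchpad measures the echo pulse width with a timer and applies this same conversion.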
Inertial Measurement Unit

We will use an Inertial Measurement Unit (IMU) in this robot to get a good estimate of the odometry value and the robot pose. The odometry values computed from the encoders alone may not be sufficient for efficient navigation, as they can contain errors. To compensate for errors during the robot's movement, we will use an IMU in this robot. We are selecting the MPU6050 as the IMU for the following reasons:

- In the MPU6050, the accelerometer and gyroscope are integrated on a single chip
- It provides high accuracy and sensitivity
- There is provision to interface a magnetometer for better IMU performance
- The breakout board of the MPU6050 is very cheap
- The MPU6050 can be directly interfaced with the Launchpad; both are 3.3 V compatible, and software libraries are available for easy interfacing

The following figure shows the breakout board of the MPU6050:
The pins and their descriptions are given as follows:

Pins        Functions
VDD, GND    Supply voltage 2.3 V - 3.4 V
INT         This pin generates an interrupt when data arrives in the device buffer
SCL, SDA    Serial Data Line (SDA) and Serial Clock Line (SCL) are used for I2C communication
ASCL, ASDA  Auxiliary I2C for communication with a magnetometer

We can purchase the breakout board from https://www.sparkfun.com/products/11028.
Kinect

Kinect is a 3D vision sensor, mainly used in 3D vision applications and motion gaming. We are using the Kinect for 3D vision. Using the Kinect, the robot will get a 3D image of its surroundings. The 3D images are converted to finer points called a point cloud. The point cloud data contains all the 3D parameters of the surroundings.
The main use of the Kinect on the robot is to mimic the functionality of a laser scanner. The laser scanner data is essential for an algorithm called SLAM, used for building a map of the environment. A laser scanner is a very costly device, so instead of buying an expensive laser scanner, we can convert a Kinect into a virtual laser scanner. Another alternative to the Kinect is the Asus Xtion PRO (http://www.asus.com/Multimedia/Xtion_PRO/), which supports the same software written for the Kinect. The point cloud to laser data conversion is done in software, so there is no need to change the hardware parts. After generating a map of the environment, the robot can navigate its surroundings.

The following image shows the various parts of a Kinect sensor:

Kinect

The Kinect mainly has an IR camera and IR projector, and also an RGB camera. The IR camera and projector together generate the 3D point cloud of the surroundings. It also has a mic array and a motorized tilt for moving the Kinect up and down.

We can purchase a Kinect from http://www.amazon.co.uk/Xbox-360-Kinect-Sensor-Adventures/dp/B0036DDW2G.
Central Processing Unit

The robot is mainly controlled by the navigation algorithm running on its PC. We can choose a laptop, mini PC, or netbook for the processing. Recently, Intel launched a mini computer called the Intel Next Unit of Computing (NUC). It has an ultra-small form factor, is lightweight, and has a good computing processor with an Intel Celeron, Core i3, or Core i5. It can support up to 16 GB of RAM and has integrated Wi-Fi/Bluetooth. We are choosing the Intel NUC because of its performance, ultra-small form factor, and light weight. We are not going for a popular board such as the Raspberry Pi (http://www.raspberrypi.org/) or BeagleBone (http://beagleboard.org/) because we require high computing power in this case, which these boards cannot provide.
The NUC we are using is the Intel DN2820FYKH. Here are the specifications of this computer:

- Intel Celeron dual-core processor at 2.39 GHz
- 4 GB RAM
- 500 GB hard disk
- Intel integrated graphics
- Headphone/microphone jack
- 12 V supply

The following image shows the Intel NUC mini computer:

Intel NUC DN2820FYKH

We can purchase the NUC from http://goo.gl/Quzi7a.
Speakers/mic

The main function of the robot is autonomous navigation. We are adding an additional feature with which the robot can interact with users through speech. The robot can be given commands by voice and can speak to the user using a text-to-speech (TTS) engine, which converts text to speech. A microphone and speakers are essential for this application. There is no particular selection for this hardware; any USB-compatible speaker and mic will do. Another alternative is a Bluetooth headset.
Power supply/battery

One of the most important hardware components is the power supply. We saw in the specifications that the robot has to work for more than 1 hour. It is convenient if the supply voltage of the battery is common to most of the components, and if the size and weight of the battery are small, it will not eat into the robot's payload. Another concern is that the maximum current needed by the entire circuit should not exceed the maximum current the battery can source. The voltage and maximum current distribution of each part of the circuit is as follows:
Components                                      Voltage and maximum current
Intel NUC PC                                    12 V, 3 A
Kinect                                          12 V, 1 A
Motors                                          12 V, 0.7 A
Motor driver, ultrasonic sensor, IMU, speakers  5 V, < 0.5 A
To meet these specifications, we are selecting a 12 V, 9 Ah lithium-polymer battery for our operation. This battery can also source a maximum current of up to 5 A.
The following image shows the battery selected for this robot:

We can buy this battery from http://goo.gl/Clzk6I.
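The battery sizing above can be sanity-checked against the 1-hour requirement with a back-of-the-envelope calculation. Dividing capacity by current ignores regulator losses and capacity derating, and the components rarely draw their table maxima all at once, so treat the result as an optimistic upper bound, not a measurement:

```python
BATTERY_CAPACITY_AH = 9.0   # 12 V, 9 Ah battery selected above

def runtime_hours(average_current_a):
    """Rough battery runtime estimate: capacity divided by average draw."""
    return BATTERY_CAPACITY_AH / average_current_a

# Worst case from the current distribution table:
# 3 A (NUC) + 1 A (Kinect) + 0.7 A (motors) + 0.5 A (5 V components) = 5.2 A
worst_case = runtime_hours(3.0 + 1.0 + 0.7 + 0.5)   # about 1.7 hours
```

Even at the worst-case draw, the estimate comfortably exceeds the 1-hour working-time specification.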
Working of the ChefBot hardware

We can explain the working of the ChefBot hardware using the following block diagram. This improved version of our first block diagram mentions the voltage of each component and its interconnections:

The main aim of this chapter was to design the hardware for ChefBot, which included finding the appropriate hardware components and working out the interconnection between each part. The main functionality of this robot is autonomous navigation, and the hardware design is optimized for it.

The robot drive is based on a differential drive system, which consists of two motors and two wheels. Caster wheels support the main wheels. The two motors can move the robot to any pose in a 2D plane by adjusting their velocities and directions.

For controlling the velocity and direction of the wheels, we have to interface a motor controller that can perform these functions. The motor driver we choose should be able to control two motors at a time and change their direction and speed.
The motor driver pins are interfaced with a microcontroller board called the Tiva C LaunchPad, which can send commands to change the direction and speed of the motors. The motor driver is interfaced with the Launchpad through a level shifter. A level shifter is a circuit that can shift voltage levels from 3.3 V to 5 V and vice versa. We are using a level shifter because the motor driver operates at a 5 V level, but the Launchpad operates at 3.3 V.
Each motor has a rotation feedback sensor called an encoder, which can be used to estimate the robot's position. The encoders are interfaced with the Launchpad, also through the level shifter.

Other sensors interfaced with the Launchpad include the ultrasonic sound sensor and the IMU. The ultrasonic sound sensor can detect nearby objects that cannot be detected by the Kinect sensor. The IMU is used along with the encoders to get a good estimate of the robot's pose.

All sensor values are received on the Launchpad and sent to the PC via USB. The Launchpad runs firmware that receives all the sensor values and sends them to the PC.

The PC is interfaced with the Kinect, the Launchpad, a speaker, and a mic. The PC runs ROS; it receives the Kinect data and converts it to the equivalent of laser scanner data. This data can be used to build a map of the environment using SLAM. The speaker/mic is used for communication between the user and the robot. The speed commands generated in ROS nodes are sent to the Launchpad, which processes them and sends appropriate PWM values to the motor driver circuit.

After designing and discussing the working of the robot hardware, we will discuss the detailed interfacing of each component and the firmware coding necessary for the interfacing.
Questions

1. What is robot hardware design all about?
2. What is an H-bridge and what are its functions?
3. Which components are essential for the robot navigation algorithm?
4. What criteria have to be kept in mind while selecting robotic components?
5. What are the main applications of the Kinect on this robot?
Summary

In this chapter, we have seen the features of the robot that we are going to design. The main feature of this robot is autonomous navigation. The robot can navigate in its surroundings by analyzing sensor readings. We went through the robot block diagram, in which we discussed the role of each block, and we then selected appropriate components that satisfy these requirements. This chapter also suggested some economical components to build this robot. In the next chapter, we will find out more about the actuators used in this robot and their interfacing.
Chapter 5. Working with Robotic Actuators and Wheel Encoders

In this chapter, we will cover:

- Interfacing a DC geared motor with Tiva C LaunchPad
- Interfacing a quadrature encoder with Tiva C LaunchPad
- Explanation of the interfacing code
- Interfacing Dynamixel actuators
In the previous chapter, we discussed the selection of the hardware components needed to build our robot. One of the important components in robot hardware is the actuator. Actuators provide mobility to the robot. In this chapter, we are concentrating on the different types of actuators that we are going to use in this robot and how they can be interfaced with the Tiva C LaunchPad, which is a 32-bit ARM microcontroller board from Texas Instruments that works at 80 MHz. The first actuator that we are going to discuss is a DC geared motor with an encoder. A DC geared motor works using direct current, and has gear reduction to reduce the shaft speed and increase the torque of the final shaft. These kinds of motors are very economical, and we can use them in our robot prototype.

In the first section of this chapter, we will deal with the design of our robot drive system. The drive system of our robot consists of two DC geared motors with encoders and a motor driver. The motor driver is controlled by the Tiva C LaunchPad. We will see the interfacing of the motor driver and the quadrature encoders with the Tiva C LaunchPad.

In the last section, we will explore some of the latest actuators that can replace the existing DC geared motor with encoder. If the desired robot needs more payload capacity and accuracy, we have to switch to these kinds of actuators.
Interfacing DC geared motor with Tiva C LaunchPad

In the previous chapter, we selected a DC geared motor with an encoder from Pololu and the embedded board from Texas Instruments called the Tiva C LaunchPad. We need the following components to interface the motors with the Launchpad:

- Two Pololu metal gear motors, 37Dx57L mm, with 64 count-per-revolution encoders
- Pololu wheels, 90x10 mm, and matching hubs
- A Pololu dual VNH2SP30 motor driver carrier MD03A
- A sealed lead acid/lithium-ion battery of 12 V
- A logic level converter from 3.3 V to 5 V (https://www.sparkfun.com/products/11978)
- A Tiva C LaunchPad and its compatible interfacing wires

The following figure shows the interfacing circuit of the two motors using the Pololu H-bridge:
Motor interfacing circuit

To interface with the Launchpad, we have to connect a level shifter board between the two. The motor driver works at 5 V but the Launchpad works at 3.3 V, so we have to connect a level shifter, as shown in the following figure:

Level shifter circuit
The two geared DC motors are connected to OUT1A, OUT1B and OUT2A, OUT2B of the motor driver. VIN (+) and GND (-) are the supply for the motors. These DC motors can work with a 12 V supply, so we give 12 V as the input voltage. The motor driver supports an input voltage range of 5.5 V to 16 V.
The control signals/input pins of the motor driver are on its left-hand side. The first pin is 1DIAG/EN; in most cases we leave this pin disconnected. These pins are pulled high externally on the driver board itself. The main use of this pin is to enable or disable the H-bridge chip, and it is also used to monitor the faulty condition of the H-bridge IC. Pins 1INA and 1INB control the direction of rotation of the motor. The 1PWM pin switches the motor ON and OFF; we achieve speed control using the PWM pins. The CS pin senses the output current; it outputs 0.13 V per ampere of output current. The VIN and GND pins take the same input voltage that we supply to the motor; we are not using these pins here. The +5V (IN) and GND pins are the supply for the motor driver IC. The supply to the motor driver and to the motors is different.
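The current-sense scaling mentioned above (0.13 V per ampere) gives a one-line conversion from CS pin voltage to motor current. The ADC helper below assumes a 12-bit converter with a 3.3 V reference, which matches the Launchpad's ADC but is an assumption about how you would read the pin:

```python
CS_VOLTS_PER_AMP = 0.13   # VNH2SP30 CS output: roughly 0.13 V per ampere

def motor_current_a(cs_voltage):
    """Convert the CS pin voltage to motor current in amperes."""
    return cs_voltage / CS_VOLTS_PER_AMP

def adc_to_volts(raw, vref=3.3, bits=12):
    """Convert a raw ADC reading to volts (12-bit, 3.3 V reference assumed)."""
    return raw * vref / ((1 << bits) - 1)

# A CS voltage of 0.26 V indicates about 2 A flowing through the motor.
i = motor_current_a(0.26)
```

Monitoring this current in firmware is a cheap way to detect a stalled motor before the driver overheats.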
The following table shows the truth table of the input and output combinations:
INA  INB  DIAGA/ENA  DIAGB/ENB  OUTA  OUTB  CS               Operating mode
1    1    1          1          H     H     High Imp         Brake to Vcc
1    0    1          1          H     L     Isense = Iout/K  Clockwise (CW)
0    1    1          1          L     H     Isense = Iout/K  Counterclockwise (CCW)
0    0    1          1          L     L     High Imp         Brake to GND

The DIAG/EN pins are always high because they are pulled high externally on the driver board itself. Using these signal combinations, we can move the robot in any direction, and by adjusting the PWM signal, we can adjust the speed of the motors too. This is the basic logic behind controlling a DC motor using an H-bridge circuit.
While interfacing the motors with the Launchpad, we may require a level shifter. This is because the output pins of the Launchpad can only supply 3.3 V but the motor driver needs 5 V to trigger; so, we have to connect a 3.3 V to 5 V logic level converter to start working.

The two motors work in a differential drive mechanism. The following section discusses differential drive and its operation.

Differential wheeled robot

The robot we have designed is a differential wheeled robot. In a differential wheeled robot, the movement is based on two separately driven wheels placed on either side of the robot's body. It can change its direction by changing the relative rate of rotation of its wheels, and hence doesn't require additional steering motion. To balance the robot, free turning wheels or caster wheels may be added.

The following figure shows a typical representation of differential drive:
If the two motors run in the same direction, the robot will move forward or backward. If one motor runs faster than the other, the robot turns toward the slower motor's side; so to turn left, stop the left motor and run the right motor. The following figure shows how we connect the two motors in our robot. The two motors are mounted on opposite sides of the base plate, and we put two casters at the front and back of the robot in order to balance it:
Top view of robot base
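The steering behaviour just described is captured by the standard differential drive kinematics: given a desired forward velocity v and angular velocity w for the robot, each wheel speed follows from the wheel separation. The 0.3 m wheel base below is an illustrative assumption:

```python
def wheel_speeds(v, w, wheel_base_m=0.3):
    """Return (left, right) wheel linear speeds in m/s for a differential drive.

    v: forward velocity of the robot centre (m/s)
    w: angular velocity, positive counterclockwise (rad/s)
    """
    v_left = v - w * wheel_base_m / 2.0
    v_right = v + w * wheel_base_m / 2.0
    return v_left, v_right

# Pure forward motion: both wheels equal. Pure left (counterclockwise) turn:
# the right wheel runs forward while the left wheel runs backward.
straight = wheel_speeds(0.2, 0.0)
turn_left = wheel_speeds(0.0, 1.0)
```

This is the same conversion the ROS navigation stack performs when it turns a velocity command into individual motor speeds.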
Next, we can program the motor controller using the Launchpad according to the truth table data. Programming is done using an IDE called Energia (http://energia.nu/). We are programming the Launchpad using a language called Wiring (http://wiring.org.co/).
Installing the Energia IDE

We can download the latest version of Energia from the following link:

http://energia.nu/download/

We will discuss the installation procedure mainly for Ubuntu 14.04.2, 64-bit. The Energia version that we will use is 0101E0013:
1. Download Energia for Linux 64-bit from the preceding link.
2. Extract the Energia compressed file into the Home folder of the user.
3. Add rules to set read and write permissions for the Tiva C LaunchPad. This is essential for writing firmware to the Launchpad. The following command will add permissions for the USB device:

   $ echo 'ATTRS{idVendor}=="1cbe", ATTRS{idProduct}=="00fd", GROUP="users", MODE="0660"' | \
     sudo tee /etc/udev/rules.d/99-tiva-launchpad.rules
4. After entering the command, plug the Launchpad into the PC.
5. Start Energia using the following command inside the folder:

   $ ./energia
The following screenshot shows the Energia IDE:
6. Now, select the board by navigating to Tools | Boards | Launchpad (Tiva C) w/ tm4c123 (80MHz), as shown in the following screenshot:
7. Then, select the serial port by navigating to Tools | Serial Port | /dev/ttyACM0, as shown in the following screenshot:
8. Compile the code using the compile button. A screenshot of a successful compilation is given here:
9. After successful compilation, upload the code to the board by clicking on the Upload button. The uploaded code is an empty sketch that performs no operations.

If the upload is successful, the following message will be shown:
Use the following tutorials to install Energia on Mac OS X and Windows:

- Refer to http://energia.nu/Guide_MacOSX.html for Mac OS X
- Refer to http://energia.nu/Guide_Windows.html for Windows
Interfacing code

The following code can be used to test the two motors in the differential drive configuration. The code moves the robot forward for 5 seconds and backward for 5 seconds. Then, it moves the robot to the left for 5 seconds and to the right for 5 seconds. After each movement, the robot stops for 1 second.

At the beginning of the code, we define the pins for INA, INB, and PWM of the two motors as follows:
/// Left Motor Pins
#define INA_1 12
#define INB_1 13
#define PWM_1 PC_6

/// Right Motor Pins
#define INA_2 5
#define INB_2 6
#define PWM_2 PC_5
The pinout for the Launchpad is given at:

http://energia.nu/pin-maps/guide_tm4c123launchpad/

The following code shows the five functions to move the robot forward, backward, left, and right, and to stop it. We will use the digitalWrite() function to write a digital value to a pin. The first argument of digitalWrite() is the pin number and the second argument is the value to be written to the pin; the value can be HIGH or LOW. We will use the analogWrite() function to write a PWM value to a pin. The first argument of this function is the pin number and the second is the PWM value, which ranges from 0 to 255. At a high PWM value, the motor driver switches fast and the motor runs at a higher speed; at a low PWM value, the switching inside the motor driver is slower, so the motor is also slower. Currently, we are running at full speed.
void move_forward()
{
  // Setting CW rotation to the Left Motor and CCW to the Right Motor
  // Left Motor
  digitalWrite(INA_1, HIGH);
  digitalWrite(INB_1, LOW);
  analogWrite(PWM_1, 255);
  // Right Motor
  digitalWrite(INA_2, LOW);
  digitalWrite(INB_2, HIGH);
  analogWrite(PWM_2, 255);
}
///////////////////////////////////////////////////////
void move_left()
{
  // Left Motor
  digitalWrite(INA_1, HIGH);
  digitalWrite(INB_1, HIGH);
  analogWrite(PWM_1, 0);
  // Right Motor
  digitalWrite(INA_2, LOW);
  digitalWrite(INB_2, HIGH);
  analogWrite(PWM_2, 255);
}
//////////////////////////////////////////////////////
void move_right()
{
  // Left Motor
  digitalWrite(INA_1, HIGH);
  digitalWrite(INB_1, LOW);
  analogWrite(PWM_1, 255);
  // Right Motor
  digitalWrite(INA_2, HIGH);
  digitalWrite(INB_2, HIGH);
  analogWrite(PWM_2, 0);
}
////////////////////////////////////////////////////////
void stop()
{
  // Left Motor
  digitalWrite(INA_1, HIGH);
  digitalWrite(INB_1, HIGH);
  analogWrite(PWM_1, 0);
  // Right Motor
  digitalWrite(INA_2, HIGH);
  digitalWrite(INB_2, HIGH);
  analogWrite(PWM_2, 0);
}
/////////////////////////////////////////////////
void move_backward()
{
  // Left Motor
  digitalWrite(INA_1, LOW);
  digitalWrite(INB_1, HIGH);
  analogWrite(PWM_1, 255);
  // Right Motor
  digitalWrite(INA_2, HIGH);
  digitalWrite(INB_2, LOW);
  analogWrite(PWM_2, 255);
}
We first set the INA and INB pins of the two motors to the OUTPUT mode, so that we can write HIGH or LOW values to these pins. The function pinMode() is used to set the mode of an I/O pin. The first argument of pinMode() is the pin number and the second argument is the mode. We can set a pin as input or output. To set a pin as output, give OUTPUT as the second argument; to set it as input, give INPUT as the second argument, as shown in the following code. There is no need to set the PWM pin as output because analogWrite() writes the PWM signal without setting pinMode():
void setup()
{
  // Setting Left Motor pins as OUTPUT
  pinMode(INA_1, OUTPUT);
  pinMode(INB_1, OUTPUT);
  pinMode(PWM_1, OUTPUT);
  // Setting Right Motor pins as OUTPUT
  pinMode(INA_2, OUTPUT);
  pinMode(INB_2, OUTPUT);
  pinMode(PWM_2, OUTPUT);
}
The following snippet is the main loop of the code. It calls each of the functions move_forward(), move_backward(), move_left(), and move_right() for 5 seconds. After calling each function, the robot stops for 1 second.
void loop()
{
  // Move forward for 5 sec
  move_forward();
  delay(5000);
  // Stop for 1 sec
  stop();
  delay(1000);
  // Move backward for 5 sec
  move_backward();
  delay(5000);
  // Stop for 1 sec
  stop();
  delay(1000);
  // Move left for 5 sec
  move_left();
  delay(5000);
  // Stop for 1 sec
  stop();
  delay(1000);
  // Move right for 5 sec
  move_right();
  delay(5000);
  // Stop for 1 sec
  stop();
  delay(1000);
}
Interfacing quadrature encoder with Tiva C Launchpad
The wheel encoder is a sensor attached to the motor to sense the number of rotations of the wheel. If we know the number of rotations, we can compute the displacement, velocity, acceleration, and angle of the wheel.
For this robot, we have chosen a motor with an in-built encoder. This encoder is of the quadrature type, which can sense both the direction and speed of the motor. Encoders use different types of sensors, such as optical and Hall sensors, to detect these parameters. This encoder uses the Hall effect to sense the rotation. The quadrature encoder has two channels, namely Channel A and Channel B. Each channel generates digital signals with a ninety degree phase shift. The following figure shows the waveform of a typical quadrature encoder:
Quadrature encoder waveforms
If the motor rotates clockwise, Channel A will lead Channel B, and if the motor rotates counterclockwise, Channel B will lead Channel A. This reading is useful to sense the direction of rotation of the motor. The following section discusses how we can translate the encoder output into useful measurements such as displacement and velocity.
Processing encoder data
The encoder data is a two-channel pulse output, 90 degrees out of phase. Using this data, we can find the direction of rotation and how many times the motor has rotated, and thereby find the displacement and velocity.
Some of the terms that specify encoder resolution are pulses per revolution (PPR) or lines per revolution (LPR), and counts per revolution (CPR). PPR specifies how many electrical pulses (0 to 1 transitions) there will be during one revolution of a motor's final shaft. Some manufacturers use the name CPR instead of PPR. Because each pulse contains two edges (rising and falling), and there are two pulse channels (A and B) with a 90 degree phase shift, the total number of edges will be four times the PPR. Most quadrature receivers use the so-called 4X decoding to count all the edges from an encoder's A and B channels, yielding 4X resolution compared to the raw PPR value.
In our motor, Pololu specifies that the CPR is 64 for the motor shaft, which corresponds to 8400 CPR at the gearbox's output shaft. In effect, we get 8400 counts from the gearbox output shaft when it completes one revolution. The following figure shows how we can compute the count from the encoder pulses:
Encoder waveform with count waveform
In this encoder specification, the count per revolution is given; it is calculated from the encoder channel edge transitions. One pulse of an encoder channel corresponds to four counts. So to get 8400 counts in our motor, the PPR will be 8400 / 4 = 2100. From the preceding figure, we will be able to calculate the number of counts in one revolution, but we also need to sense the direction of movement. This is because, irrespective of whether the robot moves forward or backward, the counts that we get will be the same; so sensing the direction is important in order to decode the signal. The following figure shows how we can decode the encoder pulses:
If we observe the code pattern, we can understand that it follows the 2-bit Gray code. A Gray code is an encoding of numbers such that adjacent numbers differ in only a single digit. Gray codes (http://en.wikipedia.org/wiki/Gray_code) are commonly used in rotary encoders for efficient coding.
We can predict the direction of rotation of the motor from the state transitions. The state transition table is given in the following figure:
State   Clockwise transition   Counterclockwise transition
0,0     0,1 to 0,0             1,0 to 0,0
1,0     0,0 to 1,0             1,1 to 1,0
1,1     1,0 to 1,1             0,1 to 1,1
0,1     1,1 to 0,1             0,0 to 0,1
It will be more convenient if we represent it as a state transition diagram:
After getting this Gray code, we can process the pulses using a microcontroller. The channel pins of the motor have to be connected to the interrupt pins of the microcontroller. When a channel has an edge transition, it generates an interrupt on that pin, and when an interrupt arrives, an interrupt service routine (simply, a function) is executed inside the microcontroller program. It can read the current state of the two pins. According to the current state of the pins and the previous values, we can determine the direction of rotation and decide whether to increment or decrement the count. This is the basic logic of encoder handling.
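The decode logic above can be sketched in plain Python before committing it to the microcontroller. This is a hypothetical helper, not code from the robot's firmware; it maps a (previous, current) pair of (A, B) readings to a count delta using the Gray-code order from the state transition table.

```python
# Gray-code sequence for clockwise rotation, taken from the state
# transition table: 0,0 -> 1,0 -> 1,1 -> 0,1 -> back to 0,0
_CW_ORDER = [(0, 0), (1, 0), (1, 1), (0, 1)]

def decode_step(prev_state, curr_state):
    """Return +1 for a clockwise step, -1 for counterclockwise,
    0 for no transition or an invalid (skipped) transition."""
    if prev_state == curr_state:
        return 0
    prev_idx = _CW_ORDER.index(prev_state)
    if _CW_ORDER[(prev_idx + 1) % 4] == curr_state:
        return +1   # next state in the CW order
    if _CW_ORDER[(prev_idx - 1) % 4] == curr_state:
        return -1   # previous state in the CW order -> CCW rotation
    return 0        # a state was skipped: treat as a decode error

# Simulate one full clockwise electrical cycle: four edges = four counts
count = 0
states = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
for prev, curr in zip(states, states[1:]):
    count += decode_step(prev, curr)
print(count)  # 4
```

This mirrors what the ISR does on the Launchpad, where the "previous state" is implicit in which edge fired the interrupt.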
After getting the count, we can calculate the angle of rotation (in degrees) using Angle = (Count Value / CPR) * 360. Here, if we substitute CPR with 8400, the equation becomes Angle = 0.04285 * Count Value; that is, to turn one degree, about 24 counts (or 6 encoder channel pulses) have to be received.
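A quick Python check of this arithmetic. The angle formula is from the text; the wheel radius and the displacement helper are illustrative assumptions, not values from the robot's design:

```python
import math

CPR = 8400  # counts per revolution of the gearbox output shaft (from the text)

def counts_to_angle_deg(count):
    """Angle in degrees = (count / CPR) * 360."""
    return (count / float(CPR)) * 360.0

def counts_to_displacement_m(count, wheel_radius_m):
    """Linear displacement of the wheel for a given count
    (wheel_radius_m is a made-up example value)."""
    revolutions = count / float(CPR)
    return revolutions * 2.0 * math.pi * wheel_radius_m

print(counts_to_angle_deg(8400))         # 360.0 -> one full revolution
print(round(counts_to_angle_deg(1), 5))  # 0.04286 degrees per count
```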
The following figure shows the interfacing circuit of one motor encoder with Tiva C LaunchPad:
Interfacing the motor encoder with Tiva C Launchpad
The output pulse level of the encoder is between 0 V and 5 V. In this case, we can directly interface the encoder with the Launchpad, because these pins can receive input of up to 5 V, or we can use a 3.3 V to 5 V level shifter like the one we used for motor driver interfacing earlier.
In the next section, we will write code in Energia to test the quadrature encoder signal. We need to check whether we get a proper count from the encoder.
Quadrature encoder interfacing code
This code will print the count of the left and right motor encoders via a serial port. The two encoders use the 2X decoding scheme, so we will get 4200 CPR. In the first section of the code, we define the pins for the two channel outputs of the two encoders and declare the count variables for the two encoders. The encoder variables use the volatile keyword before the variable data type.
The main use of volatile is that a variable with the volatile keyword will be stored in RAM, whereas normal variables may be kept in CPU registers. Encoder values change very quickly, so using an ordinary variable will not be accurate. In order to get accuracy, we will use volatile for the encoder variables, as follows:
// Encoder pins definition
// Left encoder
#define Left_Encoder_PinA 31
#define Left_Encoder_PinB 32

volatile long Left_Encoder_Ticks = 0;
// Variable to read current state of left encoder pin
volatile bool LeftEncoderBSet;

// Right encoder
#define Right_Encoder_PinA 33
#define Right_Encoder_PinB 34

volatile long Right_Encoder_Ticks = 0;
// Variable to read current state of right encoder pin
volatile bool RightEncoderBSet;
The following code snippet is the definition of the setup() function. In the Wiring language, setup() is a built-in function used for initialization and for one-time execution of variables and functions. Inside setup(), we initialize serial data communication with a baud rate of 115200 and call a user-defined function SetupEncoders() to initialize the pins of the encoders. The serial data communication is mainly used to check the encoder count via the serial terminal.
void setup()
{
  // Init serial port with 115200 baud rate
  Serial.begin(115200);
  SetupEncoders();
}
The definition of SetupEncoders() is given in the code that follows. To receive the encoder pulses, we need two Launchpad pins as input. We configure the encoder pins as Launchpad inputs and activate their pull-up resistors. The attachInterrupt() function configures one of the encoder pins as an interrupt. It has three arguments: the first argument is the pin number, the second argument is the Interrupt Service Routine (ISR), and the third argument is the interrupt condition, that is, the condition on which the interrupt has to fire the ISR. In this code, we configure PinA of the left and right encoders as interrupt pins; the ISR is called when there is a rising edge on the pin.
void SetupEncoders()
{
  // Quadrature encoders
  // Left encoder
  pinMode(Left_Encoder_PinA, INPUT_PULLUP);   // sets pin A as input
  pinMode(Left_Encoder_PinB, INPUT_PULLUP);   // sets pin B as input
  attachInterrupt(Left_Encoder_PinA, do_Left_Encoder, RISING);

  // Right encoder
  pinMode(Right_Encoder_PinA, INPUT_PULLUP);  // sets pin A as input
  pinMode(Right_Encoder_PinB, INPUT_PULLUP);  // sets pin B as input
  attachInterrupt(Right_Encoder_PinA, do_Right_Encoder, RISING);
}
The following code is the built-in loop() function in the Wiring language. The loop() function is an infinite loop where we put our main code. In this code, we call the Update_Encoders() function to print the encoder values continuously through the serial terminal.
void loop()
{
  Update_Encoders();
}
The following code is the definition of the Update_Encoders() function. It prints the two encoder values in a line, with a starting character "e", and the values separated by tab spaces. The Serial.print() function is a built-in function that prints the character/string given as its argument.
void Update_Encoders()
{
  Serial.print("e");
  Serial.print("\t");
  Serial.print(Left_Encoder_Ticks);
  Serial.print("\t");
  Serial.print(Right_Encoder_Ticks);
  Serial.print("\n");
}
The following code is the ISR definition of the left and right encoders. When a rising edge is detected on one of the PinA pins, the corresponding ISR is called. The current interrupt pins are PinA of each of the encoders. After getting the interrupt, we can assume that PinA is in the high state, so there is no need to read that pin. We read PinB of the encoder and store the pin state in LeftEncoderBSet or RightEncoderBSet. The state of PinB relative to PinA tells us the direction, so we can decide whether the count has to be incremented or decremented according to the state transition table.
void do_Left_Encoder()
{
  // Read the input pin
  LeftEncoderBSet = digitalRead(Left_Encoder_PinB);
  Left_Encoder_Ticks -= LeftEncoderBSet ? -1 : +1;
}

void do_Right_Encoder()
{
  // Read the input pin
  RightEncoderBSet = digitalRead(Right_Encoder_PinB);
  Right_Encoder_Ticks += RightEncoderBSet ? -1 : +1;
}
Upload the sketch and view the output using the serial monitor in Energia. Navigate to Tools | Serial Monitor. Move the two motors manually and you can see the count changing. Set the baud rate in the serial monitor to the same value as initialized in the code; in this case, it is 115200.
The output will look like this:
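On the PC side, each serial line has the form "e", a tab, the left count, a tab, and the right count. A minimal Python parser for this format might look as follows; parse_encoder_line is a hypothetical helper name, not part of the book's code:

```python
def parse_encoder_line(line):
    """Parse one serial line of the form 'e\t<left>\t<right>'
    into a (left, right) tuple of ints.
    Returns None for lines that do not match the expected format."""
    parts = line.strip().split('\t')
    if len(parts) != 3 or parts[0] != 'e':
        return None
    try:
        return int(parts[1]), int(parts[2])
    except ValueError:
        return None

print(parse_encoder_line('e\t1250\t-318\n'))  # (1250, -318)
```

This pairs naturally with the PySerial reading code shown later in the chapter: call parse_encoder_line() on each line returned by ser.readline().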
If we want to upgrade the robot to high accuracy and payload, we have to think about high quality actuators such as Dynamixel. Dynamixels are intelligent actuators which have in-built PID control and monitoring of servo and encoder parameters, such as torque, position, and so on.
Working with Dynamixel actuators
Dynamixel is a kind of networked actuator for robots developed by the Korean manufacturer ROBOTIS. It is widely used by companies, universities, and hobbyists due to its versatile expansion capability and its feedback of position, speed, internal temperature, input voltage, and so on.
Dynamixel servos can be connected in a daisy chain, a method of connecting devices in a serial fashion, that is, connecting one device to another through the connected devices, and all the connected servos can be controlled from one controller. Dynamixel servos communicate via RS485 or TTL. The list of available Dynamixel servos is given at http://www.robotis.com/xe/dynamixel_en.
Interfacing Dynamixel is very easy. Dynamixel comes with a controller called USB2Dynamixel, which converts USB to Dynamixel-compatible TTL/RS485 levels. The following figure shows the interfacing diagram of Dynamixel:
ROBOTIS provides the Dynamixel SDK for accessing motor registers; we can read and write values to the Dynamixel registers and retrieve data such as position, temperature, voltage, and so on.
Note
The instructions to set up USB2Dynamixel and the Dynamixel SDK are given at support.robotis.com/en/.
Dynamixel can be programmed using Python libraries. One of the Python libraries for handling Dynamixel servos is pydynamixel. This package is available for Windows and Linux. Pydynamixel supports the RX, MX, and EX series servos.
We can download the pydynamixel Python package from https://pypi.python.org/pypi/dynamixel/.
Download the package and extract it to the home folder. Open a terminal/DOS prompt and execute the following command:
sudo python setup.py install
After installing the package, we can try the following Python example, which will detect the servo attached to the USB2Dynamixel and write some random positions to it. This example will work with RX and MX servos.
#!/usr/bin/env python
The following code imports the necessary Python modules required for this example. This includes the Dynamixel Python module too:

import os
import dynamixel
import time
import random
The following code defines the main parameters needed for Dynamixel communication. The nServos variable denotes the number of Dynamixel servos connected to the bus. The portName variable indicates the serial port of the USB2Dynamixel to which the Dynamixel servos are connected. The baudRate variable is the communication speed of USB2Dynamixel and Dynamixel.

# The number of Dynamixels on our bus.
nServos = 1

# Set your serial port accordingly.
if os.name == "posix":
    portName = "/dev/ttyUSB0"
else:
    portName = "COM6"

# Default baud rate of the USB2Dynamixel device.
baudRate = 1000000
The following code calls the Dynamixel Python functions to connect to the Dynamixel servos. If connected, the program prints a message and scans the communication bus to find servos with IDs starting from 1 up to 255. The servo ID is the identification of each servo. We have given nServos as 1, so it will stop scanning after finding one servo on the bus:

# Connect to the serial port
print "Connecting to serial port", portName, '...',
serial = dynamixel.serial_stream.SerialStream(port=portName, baudrate=baudRate, timeout=1)
print "Connected!"
net = dynamixel.dynamixel_network.DynamixelNetwork(serial)
net.scan(1, nServos)
The following code appends each Dynamixel ID and servo object to the myActuators list. We can write servo values to each servo using the servo ID and servo object, and use the myActuators list for further processing:

# A list to hold the dynamixels
myActuators = list()
print myActuators

This will create a list for storing the Dynamixel actuator details.

print "Scanning for Dynamixels...",
for dyn in net.get_dynamixels():
    print dyn.id,
    myActuators.append(net[dyn.id])
print "...Done"
The following code sets the default servo parameters, such as moving_speed, torque_enable, torque_limit, and max_torque, for each Dynamixel actuator available on the bus. Later, the program will write random positions from 450 to 600 to each actuator; the range of positions in Dynamixel is 0 to 1023.

# Set the default speed and torque
for actuator in myActuators:
    actuator.moving_speed = 50
    actuator.synchronized = True
    actuator.torque_enable = True
    actuator.torque_limit = 800
    actuator.max_torque = 800
The following code writes a random goal position to each actuator and synchronizes the bus:

# Move the servos randomly and print out their current positions
while True:
    for actuator in myActuators:
        actuator.goal_position = random.randrange(450, 600)
    net.synchronize()
The following code, inside the same loop, reads all data from the actuators and prints each actuator's ID and current position:

    for actuator in myActuators:
        actuator.read_all()
        time.sleep(0.01)

    for actuator in myActuators:
        print actuator.cache[dynamixel.defs.REGISTER['Id']], actuator.cache[dynamixel.defs.REGISTER['CurrentPosition']]

    time.sleep(2)
Questions
1. What is the H-Bridge circuit?
2. What is a quadrature encoder?
3. What is the 4X encoding scheme?
4. How do we calculate displacement from encoder data?
5. What are the features of the Dynamixel actuator?
Summary
In this chapter, we have discussed the interfacing of the motors that we are using in our robot. We have seen motor and encoder interfacing with a controller board called Tiva C LaunchPad, and we discussed the controller code for interfacing the motor and encoder. In the future, if the robot requires high accuracy and torque, we have seen Dynamixel servos that can substitute the current DC motors. In the next chapter, we will see the different kinds of sensors that can be used in robots and their interfacing.
Chapter 6. Working with Robotic Sensors
In the previous chapter, we saw the interfacing of some actuators for our service robot. The next important section we need to cover is the robotic sensors used in this robot.
We are using sensors in this robot to find the distance from an obstacle, to get the robot's odometry data, and for robotic vision and acoustics.
The sensors used are ultrasonic distance sensors or IR proximity sensors, to detect obstacles and avoid collisions; vision sensors such as Kinect, to acquire 3D data of the environment for visual odometry and object detection; and audio devices such as speakers and microphones, for speech recognition and synthesis.
In this chapter, we are not covering the interfacing of vision and audio sensors, because in the upcoming chapters we will discuss them and their interfacing in detail.
Working with ultrasonic distance sensors
One of the most important features of a mobile robot is navigation. Ideal navigation means the robot can plan its path from its current position to the destination and can move without any obstacles. We use ultrasonic distance sensors in this robot to detect objects in close proximity that can't be detected using the Kinect sensor. A combination of Kinect and ultrasonic sound sensors provides ideal collision avoidance for this robot.
Ultrasonic distance sensors work in the following manner. The transmitter sends an ultrasonic sound that is not audible to human ears. After sending an ultrasonic wave, the sensor waits for an echo of the transmitted wave. If there is no echo, there are no obstacles in front of the robot. If the receiving sensor receives an echo, a pulse is generated on the receiver, and the sensor can calculate the total time the wave took to travel to the object and return to the receiver. If we get this time, we can compute the distance to the obstacle using the following formula:

Distance from Object = Speed of Sound * Time Passed / 2

Here, the speed of sound can be taken as 340 m/s.
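The formula above is easy to check in Python. This is an illustrative sketch (the function name is ours, not from the book); the division by 2 accounts for the wave travelling to the object and back:

```python
SPEED_OF_SOUND_M_S = 340.0  # speed of sound in air, as used in the text

def echo_time_to_distance_m(round_trip_s):
    """Distance = speed of sound * time / 2.
    round_trip_s is the total echo time in seconds (out and back)."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# A 1 ms round trip corresponds to an obstacle 17 cm away
print(echo_time_to_distance_m(0.001))  # 0.17
```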
Most ultrasonic range sensors have a distance range from 2 cm to 400 cm. In this robot, we use a sensor module called HC-SR04. Let us see how to interface the HC-SR04 with Tiva C LaunchPad to get the distance from obstacles.
Interfacing HC-SR04 to Tiva C LaunchPad
The following figure shows the interfacing circuit of the HC-SR04 ultrasonic sound sensor with Tiva C LaunchPad:
Interfacing diagram of Launchpad and HC-SR04
The working voltage of the ultrasonic sensor is 5 V and the input/output of this sensor is also 5 V, so we need a level shifter on the Trig and Echo pins for interfacing with the 3.3 V level Launchpad. In the level shifter, we need to apply the high voltage, that is, 5 V, and the low voltage, that is, 3.3 V, as shown in the figure, to switch from one level to the other. The Trig and Echo pins are connected on the high voltage side of the level shifter, and the low voltage side pins are connected to the Launchpad. The Trig pin and Echo pin are connected to the 10th and 9th pins of the Launchpad. After interfacing the sensor, we can see how to program the two I/O pins.
Working of HC-SR04
The timing diagram of the waveform on each pin is shown in the following diagram. We need to apply a short 10 µs pulse to the trigger input to start the ranging; the module will then send out an eight-cycle burst of ultrasound at 40 kHz and raise its echo pin. The echo pin outputs a pulse whose width is proportional to the range. You can calculate the range from the time interval between sending the trigger signal and receiving the echo signal using the following formula:

Range = high level time of echo pin output * velocity (340 m/s) / 2

It is better to use a delay of 60 ms before each trigger, to avoid overlapping between the trigger and echo:
Interfacing code of Tiva C LaunchPad
The following Energia code for Launchpad reads values from the ultrasonic sensor and monitors the values through a serial port.
The following code defines the Launchpad pins that handle the ultrasonic echo and trigger signals, and also defines variables for the duration of the pulse and the distance in centimeters:

const int echo = 9, Trig = 10;
long duration, cm;
The following code snippet is the setup() function. The setup() function is called when a sketch starts. Use it to initialize variables, pin modes, start using libraries, and so on. The setup function runs only once, after each power-up or reset of the Launchpad board. Inside setup(), we initialize serial communication with a baud rate of 115200 and set up the mode of the ultrasonic handling pins by calling the function SetupUltrasonic():
void setup()
{
  // Init serial port with 115200 baud rate
  Serial.begin(115200);
  SetupUltrasonic();
}
The following is the setup function for the ultrasonic sensor; it configures the Trigger pin as OUTPUT and the Echo pin as INPUT. The pinMode() function is used to set a pin as INPUT or OUTPUT.
void SetupUltrasonic()
{
  pinMode(Trig, OUTPUT);
  pinMode(echo, INPUT);
}
After creating the setup() function, which initializes and sets the initial values, the loop() function does precisely what its name suggests and loops consecutively, allowing your program to change and respond. Use it to actively control the Launchpad board.

The main loop of this code follows. It is an infinite loop and calls the Update_Ultra_Sonic() function to update and print the ultrasonic readings through the serial port:
void loop()
{
  Update_Ultra_Sonic();
  delay(200);
}
The following code is the definition of the Update_Ultra_Sonic() function. This function does the following operations. First, it takes the trigger pin to the LOW state for 2 microseconds and HIGH for 10 microseconds. After 10 microseconds, it again returns the pin to the LOW state. This is according to the timing diagram; we already saw that 10 µs is the trigger pulse width.

After triggering with 10 µs, we have to read the time duration from the Echo pin. The time duration is the time taken for the sound to travel from the sensor to the object and from the object back to the sensor receiver. We can read the pulse duration using the pulseIn() function. After getting the time duration, we can convert the time into centimeters using the microsecondsToCentimeters() function, as shown in the following code:
void Update_Ultra_Sonic()
{
  digitalWrite(Trig, LOW);
  delayMicroseconds(2);
  digitalWrite(Trig, HIGH);
  delayMicroseconds(10);
  digitalWrite(Trig, LOW);

  duration = pulseIn(echo, HIGH);
  // Convert the time into a distance
  cm = microsecondsToCentimeters(duration);

  // Sending through serial port
  Serial.print("distance=");
  Serial.print("\t");
  Serial.print(cm);
  Serial.print("\n");
}
The following code is the conversion function from microseconds to distance in centimeters. The speed of sound is 340 m/s, that is, about 29 microseconds per centimeter. So we get the distance by dividing the total microseconds by 29 and then by 2:
long microsecondsToCentimeters(long microseconds)
{
  return microseconds / 29 / 2;
}
After uploading the code, open the serial monitor from the Energia menu under Tools | Serial Monitor and change the baud rate to 115200. You can see the values from the ultrasonic sensor, like this:

Output of the Energia serial monitor
Interfacing Tiva C LaunchPad with Python
In this section, we will see how to connect Tiva C LaunchPad with Python to receive data from the Launchpad.
The PySerial module can be used for interfacing the Launchpad with Python. The detailed documentation of PySerial and its installation procedure for Windows, Linux, and OS X is at the following link:
http://pyserial.sourceforge.net/pyserial.html
PySerial is available in the Ubuntu package manager and can be easily installed in Ubuntu using the following command in the terminal:
$ sudo apt-get install python-serial
After installing the python-serial package, we can write Python code to interface the Launchpad. The interfacing code is given in the following section.
The following code imports the Python serial module and the sys module. The serial module handles the serial ports of the Launchpad and performs operations such as reading, writing, and so on. The sys module provides access to some variables used or maintained by the interpreter and to functions that interact strongly with the interpreter. It is always available:
#!/usr/bin/env python
import serial
import sys
When we plug the Launchpad into the computer, the device registers on the OS as a virtual serial port. In Ubuntu, the device name looks like /dev/ttyACMx, where x can be a number; if there is only one device, it will probably be 0. To interact with the Launchpad, we need to handle this device file only.
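Rather than hardcoding /dev/ttyACM0, you can list candidate device files with the standard glob module. This is an illustrative sketch, not part of the book's script; the /dev/ttyACM* pattern is an assumption that holds for Ubuntu, and on Windows you would look for COM ports instead:

```python
import glob

def find_launchpad_ports(pattern='/dev/ttyACM*'):
    """Return a sorted list of device files matching the given pattern.
    The default pattern assumes an Ubuntu system; adjust for your OS."""
    return sorted(glob.glob(pattern))

ports = find_launchpad_ports()
print(ports[0] if ports else 'No Launchpad device found')
```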
The following code will try to open the serial port /dev/ttyACM0 of the Launchpad with a baud rate of 115200. If it fails, it will print Unable to open serial port.

try:
    ser = serial.Serial('/dev/ttyACM0', 115200)
except:
    print "Unable to open serial port"
The following code reads the serial data until the serial character becomes a new line ('\n') and prints it on the terminal. If we press Ctrl + C on the keyboard to quit the program, it will exit by calling sys.exit(0).

while True:
    try:
        line = ser.readline()
        print line
    except:
        print "Unable to read from device"
        sys.exit(0)
After saving the file, change the permission of the file to executable and run it using the following commands:

$ sudo chmod +x script_name
$ ./script_name
The output of the script will look like this:
Working with the IR proximity sensor
Infrared sensors are another method to find obstacles and the distance from the robot. The principle of infrared distance sensors is based on the infrared light that is reflected from a surface when hitting an obstacle. An IR receiver captures the reflected light, and the voltage is measured based on the amount of light received.
One of the popular IR range sensors is the Sharp GP2D12; the product link is as follows:
http://www.robotshop.com/en/sharp-gp2y0a21yk0f-ir-range-sensor.html
The following figure shows the Sharp GP2D12 sensor:
The sensor sends out a beam of IR light and uses triangulation to measure the distance. The detection range of the GP2D12 is between 10 cm and 80 cm. The beam is 6 cm wide at a distance of 80 cm. The transmission and reflection of the IR light sensor is illustrated in the following figure:
On the left of the sensor is an IR transmitter, which continuously sends IR radiation. After hitting an object, the IR light is reflected, and it is received by the IR receiver. The interfacing circuit of the IR sensor is shown here:
The analog output pin Vo can be connected to an ADC pin of the Launchpad. The interfacing code of the Sharp distance sensor with the Tiva C LaunchPad is given further in this section. In this code, we select the 18th pin of the Launchpad, set it to ADC mode, and read the voltage levels from the Sharp distance sensor. The range equation of the GP2D12 IR sensor is given as follows:
Range = (6787 / (Volt - 3)) - 4
Here, Volt is the analog value from the ADC of the Vout pin.
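Before moving to the microcontroller code, the range equation can be sanity-checked in Python. The ADC reading used below is a made-up sample value, not a calibrated measurement:

```python
def gp2d12_range_cm(adc_value):
    """Apply the GP2D12 range equation: Range = (6787 / (Volt - 3)) - 4."""
    if adc_value <= 3:
        # The equation diverges near 3; such readings are outside the valid range
        raise ValueError("ADC reading too small for the range equation")
    return (6787.0 / (adc_value - 3.0)) - 4.0

# A hypothetical mid-scale ADC reading of 400 gives roughly 13.1 cm
distance = gp2d12_range_cm(400)
```

Note that a smaller ADC reading maps to a larger distance, and the result is only meaningful inside the sensor's 10 cm to 80 cm detection range.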
In this first section of the code, we set the 18th pin of the Tiva C LaunchPad as the input pin and start serial communication at a baud rate of 115200:
int IR_SENSOR = 18;      // Sensor is connected to the analog pin A3
int intSensorResult = 0; // Sensor result
float fltSensorCalc = 0; // Calculated value
void setup()
{
  Serial.begin(115200); // Set up communication with the serial monitor to present results
}
In the following section of code, the controller continuously reads the analog pin and converts it to the distance value in centimeters:
void loop()
{
  // read the value from the IR sensor
  intSensorResult = analogRead(IR_SENSOR); // Get sensor value
  // Calculate distance in cm according to the range equation
  fltSensorCalc = (6787.0 / (intSensorResult - 3.0)) - 4.0;
  Serial.print(fltSensorCalc); // Send distance to computer
  Serial.println(" cm");       // Add cm to result
  delay(200);                  // Wait
}
This is the basic code to interface a Sharp distance sensor. There are some drawbacks with IR sensors. Some of them are as follows:
- We can't use them in direct or indirect sunlight, so it's difficult to use them in an outdoor robot
- They may not work if an object is reflective
- The range equation only works within the range
In the next section, we will discuss the IMU and its interfacing with the Tiva C LaunchPad. An IMU can give odometry data and it can be used as the input to navigation algorithms.
Working with Inertial Measurement Unit
An Inertial Measurement Unit (IMU) is an electronic device that measures velocity, orientation, and gravitational forces using a combination of accelerometers, gyroscopes, and magnetometers. An IMU has a lot of applications in robotics; some of the applications are in balancing of Unmanned Aerial Vehicles (UAVs) and robot navigation.
In this section, we discuss the role of the IMU in mobile robot navigation, some of the latest IMUs on the market, and their interfacing with the Launchpad.
Inertial Navigation
An IMU provides acceleration and orientation relative to inertial space. If you know the initial position, velocity, and orientation, you can calculate the velocity by integrating the sensed acceleration; a second integration gives the position. To get the correct direction of the robot, its orientation is required; this can be obtained by integrating the sensed angular velocity from the gyroscope.
The following figure illustrates an inertial navigation system, which will convert IMU values to odometric data:
The values we get from the IMU are converted into navigational information using navigation equations and fed into estimation filters such as the Kalman filter. The Kalman filter is an algorithm that estimates the state of a system from the measured data (http://en.wikipedia.org/wiki/Kalman_filter). The data from an Inertial Navigation System (INS) will have some drift because of the error from the accelerometer and gyroscope. To limit the drift, an INS is usually aided by other sensors that provide direct measurements of the integrated quantities. Based on the measurements and sensor error models, the Kalman filter estimates errors in the navigation equations and all the colored sensors' errors. The following figure shows a diagram of an aided inertial navigation system using the Kalman filter:
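The double integration described above can be illustrated with a toy one-dimensional sketch. This uses simple Euler integration and assumes a constant sampling interval; a real INS integrates in three axes and corrects the drifting result with the Kalman filter:

```python
def integrate_motion(accels, dt, v0=0.0, x0=0.0):
    """Euler-integrate a sequence of 1-D acceleration samples (m/s^2):
    the first integration yields velocity, the second yields position."""
    v, x = v0, x0
    for a in accels:
        v += a * dt  # first integration: velocity
        x += v * dt  # second integration: position
    return v, x

# A constant 1 m/s^2 for 10 steps of 0.1 s reaches 1.0 m/s
v, x = integrate_motion([1.0] * 10, 0.1)
```

Any constant bias in the sensed acceleration grows quadratically in the position estimate, which is exactly the drift the aiding sensors are there to bound.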
Along with the motor encoders, the value from the IMU can be taken as the odometry value and it can be used for dead reckoning, the process of finding the current position of a moving object by using a previously determined position.
In the next section, we are going to see one of the most popular IMUs from InvenSense, called MPU6050.
Interfacing MPU6050 with Tiva C LaunchPad
The MPU-6000/MPU-6050 family of parts is the world's first and only 6-axis motion tracking device designed for the low power, low cost, and high performance requirements of smartphones, tablets, wearable sensors, and robotics.
The MPU-6000/6050 devices combine a 3-axis gyroscope and a 3-axis accelerometer on the same silicon die, together with an onboard digital motion processor capable of processing complex 9-axis motion fusion algorithms. The following figure shows the system diagram and breakout of the MPU6050:
The breakout board of MPU6050 is shown in the following figure and it can be purchased from the following link:
https://www.sparkfun.com/products/110286
The connection from the Launchpad to the MPU6050 is given in the following table. The remaining pins can be left disconnected:
Launchpad pins    MPU6050 pins
+3.3V             VCC/VDD
GND               GND
PD0               SCL
PD1               SDA
The following figure shows the interfacing diagram of MPU6050 and Tiva C Launchpad:
The MPU6050 and Launchpad communicate using the I2C protocol; the supply voltage is 3.3 Volt and it is taken from the Launchpad.
Setting up the MPU6050 library in Energia
The interfacing code of Energia is discussed in this section. The interfacing code uses the https://github.com/jrowberg/i2cdevlib/zipball/master library for interfacing MPU6050.
Download the ZIP file from the preceding link and navigate to Preferences from File | Preferences in Energia, as shown in the following screenshot:
Go to Sketchbook location under Preferences, as seen in the preceding screenshot, and create a folder called libraries. Extract the files inside the Arduino folder inside the ZIP file to the sketchbook/libraries location. The Arduino packages in this repository are also compatible with the Launchpad. The extracted files contain the I2Cdev, Wire, and MPU6050 packages that are required for the interfacing of the MPU6050 sensor. There are other sensor packages present in the libraries folder, but we are not using them now.
The preceding procedure is done in Ubuntu, but it is the same for Windows and Mac OS X.
Interfacing code of Energia
This code is used to read the raw values from MPU6050 to the Launchpad; it uses an MPU6050 third-party library that is compatible with the Energia IDE. The following are the explanations of each block of the code.
In this first section of code, we include the necessary headers for interfacing MPU6050 to the Launchpad, such as the I2Cdev, Wire, and MPU6050 libraries, and create an object of MPU6050 with the name accelgyro. The MPU6050.h library contains a class named MPU6050 to send and receive data to and from the sensor:
#include "Wire.h"
#include "I2Cdev.h"
#include "MPU6050.h"
MPU6050 accelgyro;
In the following section, we start the I2C and serial communication to communicate with MPU6050 and print sensor values through the serial port. The serial communication baud rate is 115200 and Setup_MPU6050() is the custom function to initialize the MPU6050 communication:
void setup()
{
  // Init Serial port with 115200 baud rate
  Serial.begin(115200);
  Setup_MPU6050();
}
The following section is the definition of the Setup_MPU6050() function. The Wire library allows you to communicate with I2C devices, and MPU6050 can communicate using I2C. The Wire.begin() function will start the I2C communication between MPU6050 and the Launchpad; it will also initialize the MPU6050 device using the initialize() method defined in the MPU6050 class. If everything is successful, it will print connection successful; otherwise, it will print connection failed:
void Setup_MPU6050()
{
  Wire.begin();
  // initialize device
  Serial.println("Initializing I2C devices...");
  accelgyro.initialize();
  // verify connection
  Serial.println("Testing device connections...");
  Serial.println(accelgyro.testConnection() ? "MPU6050 connection successful" :
    "MPU6050 connection failed");
}
The following code is the loop() function, which continuously reads the sensor values and prints them through the serial port. The Update_MPU6050() custom function is responsible for printing the updated values from MPU6050:
void loop()
{
  // Update MPU6050
  Update_MPU6050();
}
The definition of Update_MPU6050() is given as follows. It declares six variables to handle the accelerometer and gyroscope values along the three axes. The getMotion6() function in the MPU6050 class is responsible for reading the new values from the sensor. After reading, it prints them via the serial port:
void Update_MPU6050()
{
  int16_t ax, ay, az;
  int16_t gx, gy, gz;
  // read raw accel/gyro measurements from device
  accelgyro.getMotion6(&ax, &ay, &az, &gx, &gy, &gz);
  // display tab-separated accel/gyro x/y/z values
  Serial.print("i"); Serial.print("\t");
  Serial.print(ax); Serial.print("\t");
  Serial.print(ay); Serial.print("\t");
  Serial.print(az); Serial.print("\t");
  Serial.print(gx); Serial.print("\t");
  Serial.print(gy); Serial.print("\t");
  Serial.println(gz);
  Serial.print("\n");
}
The output from the serial monitor is shown here:
We can read these values using the Python code that we used for the ultrasonic sensor. The following is a screenshot of the terminal when we run the Python script:
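On the PC side, each serial line has the form "i", followed by the six tab-separated raw readings. A minimal parsing sketch is shown below; actually opening the port would use the third-party pyserial package (for example, serial.Serial('/dev/ttyACM0', 115200), where the port name is only a typical guess):

```python
def parse_imu_line(line):
    """Parse one tab-separated IMU line ("i\tax\tay\taz\tgx\tgy\tgz")
    into a dict of raw accelerometer and gyroscope readings."""
    fields = line.strip().split('\t')
    if len(fields) != 7 or fields[0] != 'i':
        return None  # ignore malformed or partial lines
    ax, ay, az, gx, gy, gz = (int(v) for v in fields[1:])
    return {'accel': (ax, ay, az), 'gyro': (gx, gy, gz)}
```

Skipping malformed lines matters here because the first line read after opening the port is often truncated.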
Interfacing MPU6050 to Launchpad with the DMP support using Energia
In this section, we will see the interfacing code of MPU6050 with the DMP activated, which can give us direct orientation values in quaternion or yaw, pitch, and roll. These values can be directly applied in our robotic application too.
The following section of code imports all the necessary header files to interface and creates an MPU6050 object like the previous code:
#include "Wire.h"
#include "I2Cdev.h"
#include "MPU6050_6Axis_MotionApps20.h"
// Creating MPU6050 Object
MPU6050 accelgyro(0x68);
The following code initializes and declares variables to handle the DMP:
// DMP options
// Set true if DMP initialization was successful
bool dmpReady = false;
// Holds actual interrupt status byte from MPU
uint8_t mpuIntStatus;
// return status after each device operation
uint8_t devStatus;
// Expected DMP packet size
uint16_t packetSize;
// count of all bytes currently in FIFO
uint16_t fifoCount;
// FIFO storage buffer
uint8_t fifoBuffer[64];
// Output format will be in quaternion
#define OUTPUT_READABLE_QUATERNION
The following code declares the variable to hold the orientation values:
// quaternion variable
Quaternion q;
The following function is an interrupt service routine, which is called when the MPU6050 INT pin generates an interrupt:
// Interrupt detection routine for DMP handling
volatile bool mpuInterrupt = false;
// indicates whether MPU interrupt pin has gone high
void dmpDataReady() {
  mpuInterrupt = true;
}
The following code is the definition of the setup() function. It initializes the serial port with a baud rate of 115200 and calls the Setup_MPU6050() function:
void setup()
{
  // Init Serial port with 115200 baud rate
  Serial.begin(115200);
  Setup_MPU6050();
}
The following code is the definition of the Setup_MPU6050() function. It will initialize MPU6050 and check whether it's initialized or not. If it's initialized, it will initialize the DMP by calling the Setup_MPU6050_DMP() function:
void Setup_MPU6050()
{
  Wire.begin();
  // initialize device
  Serial.println("Initializing I2C devices...");
  accelgyro.initialize();
  // verify connection
  Serial.println("Testing device connections...");
  Serial.println(accelgyro.testConnection() ? "MPU6050 connection successful" :
    "MPU6050 connection failed");
  // Initialize DMP in MPU6050
  Setup_MPU6050_DMP();
}
The following code is the definition of the Setup_MPU6050_DMP() function. It initializes the DMP and sets the gyroscope and accelerometer offsets. If the DMP is initialized successfully, it is enabled and the PF_0/PUSH2 pin is configured as an interrupt. When the data is ready in the MPU6050 buffer, an interrupt will be generated, which triggers a read of the values from the bus:
// Setup MPU6050 DMP
void Setup_MPU6050_DMP()
{
  // DMP Initialization
  devStatus = accelgyro.dmpInitialize();
  accelgyro.setXGyroOffset(220);
  accelgyro.setYGyroOffset(76);
  accelgyro.setZGyroOffset(-85);
  accelgyro.setZAccelOffset(1788);
  if (devStatus == 0) {
    accelgyro.setDMPEnabled(true);
    pinMode(PUSH2, INPUT_PULLUP);
    attachInterrupt(PUSH2, dmpDataReady, RISING);
    mpuIntStatus = accelgyro.getIntStatus();
    dmpReady = true;
    packetSize = accelgyro.dmpGetFIFOPacketSize();
  }
  else {
    // Do nothing
    ;
  }
}
The following code is the definition of the loop() function. It will call Update_MPU6050(), which will read the buffer values and print them on the serial terminal:
void loop()
{
  // Update MPU6050
  Update_MPU6050();
}
This is the definition of Update_MPU6050(), which will call the Update_MPU6050_DMP() function:
void Update_MPU6050()
{
  Update_MPU6050_DMP();
}
The following function reads from the FIFO register of MPU6050 and the quaternion values get printed on the serial terminal:
// Update MPU6050 DMP functions
void Update_MPU6050_DMP()
{
  // DMP Processing
  if (!dmpReady) return;
  while (!mpuInterrupt && fifoCount < packetSize)
  {
    ;
  }
  mpuInterrupt = false;
  mpuIntStatus = accelgyro.getIntStatus();
  // get current FIFO count
  fifoCount = accelgyro.getFIFOCount();
  if ((mpuIntStatus & 0x10) || fifoCount > 512) {
    // reset so we can continue cleanly
    accelgyro.resetFIFO();
  }
  else if (mpuIntStatus & 0x02) {
    // wait for correct available data length, should be a VERY short wait
    while (fifoCount < packetSize) fifoCount = accelgyro.getFIFOCount();
    // read a packet from FIFO
    accelgyro.getFIFOBytes(fifoBuffer, packetSize);
    // track FIFO count here in case there is > 1 packet available
    // (this lets us immediately read more without waiting for an interrupt)
    fifoCount -= packetSize;
#ifdef OUTPUT_READABLE_QUATERNION
    // display quaternion values in easy matrix form: w x y z
    accelgyro.dmpGetQuaternion(&q, fifoBuffer);
    Serial.print("i"); Serial.print("\t");
    Serial.print(q.x); Serial.print("\t");
    Serial.print(q.y); Serial.print("\t");
    Serial.print(q.z); Serial.print("\t");
    Serial.print(q.w);
    Serial.print("\n");
#endif
  }
}
The output from the serial monitor is shown in the following screenshot. The serial monitor shows the quaternion values of x, y, z, and w, starting with an "i" character:
We can also use the Python script to view these values. The output of the Python script is shown in the following screenshot:
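If yaw, pitch, and roll are preferred over raw quaternions on the PC side, the received w, x, y, z values can be converted with the standard quaternion-to-Euler formulas. This is a sketch; angle conventions differ between libraries, so treat the axis ordering here as an assumption:

```python
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in radians."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # clamp guards against tiny numerical overshoot outside [-1, 1]
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# The identity quaternion corresponds to no rotation
angles = quaternion_to_euler(1.0, 0.0, 0.0, 0.0)  # (0.0, 0.0, 0.0)
```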
In the next chapters, we will see some of the vision and audio sensors that can be used on this robot and their interfacing with Python.
Questions
1. What are ultrasonic sensors and how do they work?
2. How do you calculate distance from the ultrasonic sensor?
3. What is the IR proximity sensor and how does it work?
4. How do you calculate distance from the IR sensor?
5. What is an IMU and how do you get the odometric data?
6. What is an aided inertial navigation system?
7. What are the main features of MPU6050?
Summary
In this chapter, we have seen some robotic sensors which can be used in our robot. The sensors we discussed are ultrasonic distance sensors, IR proximity sensors, and IMUs. These three sensors help in the navigation of the robot. We also discussed the basic code to interface these sensors to the Tiva C LaunchPad. We will see more on vision and audio sensor interfacing using Python in the next chapter.
Chapter 7. Programming Vision Sensors Using Python and ROS
In the previous chapter, we looked at some of the robotic sensors used in our robot and their interfacing with the Launchpad board. In this chapter, we will mainly discuss vision sensors and their interface used in our robot.
The robot we are designing will have a 3D sensor and we can interface it with vision libraries such as OpenCV, OpenNI, and Point Cloud Library (PCL). Some of the applications of the 3D vision sensor in our robot are autonomous navigation, obstacle avoidance, object detection, people tracking, and so on.
We will also discuss the interfacing of vision sensors and image processing libraries with ROS. In the last section of the chapter, we will see a navigational algorithm for our robot called SLAM (Simultaneous Localization and Mapping) and its implementation using a 3D sensor, ROS, and image processing libraries.
In the first section, we will see some 2D and 3D vision sensors available on the market that we will use in our robot.
List of robotic vision sensors and image processing libraries
A 2D vision sensor or an ordinary camera delivers 2D image frames of the surroundings, whereas a 3D vision sensor delivers 2D image frames and an additional parameter called depth for each image point. We can find the x, y, and z distance of each point from the 3D sensor with respect to the sensor axis.
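The x, y, z recovery mentioned above follows the standard pinhole camera model: each pixel is back-projected using the sensor's focal length and principal point. The intrinsic values below are typical defaults for a Kinect-class 640x480 depth camera, not calibrated values for any particular unit:

```python
def depth_to_point(u, v, depth_m, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Back-project pixel (u, v) with depth in meters to a 3D point
    (x, y, z) in the camera frame, using the pinhole model."""
    x = (u - cx) * depth_m / fx  # horizontal offset scales with depth
    y = (v - cy) * depth_m / fy  # vertical offset scales with depth
    return x, y, depth_m

# A pixel at the image center maps straight ahead of the sensor
point = depth_to_point(319.5, 239.5, 2.0)  # (0.0, 0.0, 2.0)
```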
There are quite a few vision sensors available on the market. Some of the 2D and 3D vision sensors that can be used in our robot are mentioned in this chapter.
The following figure shows the latest 2D vision sensor called Pixy/CMUcam5 (http://www.cmucam.org/), which is able to detect color objects with high speed and accuracy and can be interfaced to an Arduino board. Pixy can be used for fast object detection and the user can teach it which object it needs to track. The Pixy module has a CMOS sensor and an NXP (http://www.nxp.com/) processor for image processing:
Pixy/CMUCam5
The commonly available 2D vision sensors are webcams. They contain a CMOS sensor and a USB interface, but there is no inbuilt processing for object detection. The following image shows a popular webcam from Logitech that can capture pictures of up to 5 megapixel resolution and HD videos:
Logitech HD Cam
We can take a look at some of the 3D vision sensors available on the market. Some of the popular sensors are Kinect, Asus Xtion Pro, and Carmine.
Kinect
Kinect is a 3D vision sensor used along with the Microsoft Xbox 360 game console. It mainly contains an RGB camera, an infrared projector, a depth sensor, a microphone array, and a motor for tilt. The RGB and depth cameras capture images at a resolution of 640x480 at 30 Hz. The RGB camera captures 2D color images, whereas the depth camera captures monochrome depth images. Kinect has a depth sensing range from 0.8 m to 4 m.
Some of the applications of Kinect are 3D motion capture, skeleton tracking, face recognition, and voice recognition.
Kinect can be interfaced to a PC using the USB 2.0 interface and programmed using Kinect SDK, OpenNI, and OpenCV. The Kinect SDK is only available for Windows platforms and is developed and supplied by Microsoft. The other two libraries are open source and available for all platforms. The Kinect we are using here is the first version; the latest versions of Kinect only support the Kinect SDK running on Windows.
Asus Xtion Pro
Asus Xtion Pro is a 3D sensor designed for PC-based motion sensing applications. Xtion Pro is only for 3D sensing and it doesn't have any sound sensing facilities. It has an infrared projector and a monochrome CMOS sensor to capture the infrared data. Xtion Pro communicates with the PC via the USB 2.0 interface. Xtion can be powered from the USB itself and can calculate a sensed depth from 0.8 m to 3.5 m from the sensor.
The applications of Kinect and Xtion Pro are the same except for voice recognition. It will work in Windows, Linux, and Mac. We can develop applications for Xtion Pro using OpenNI and OpenCV.
PrimeSense Carmine
The PrimeSense team developed the Microsoft Kinect 3D vision system. Later, they developed their own 3D vision sensor called Carmine. The technology behind Carmine is similar to Kinect. It works with an IR projector and a depth image CMOS sensor. The following figure shows the block diagram of Carmine:
Carmine block diagram
Similar to Kinect, Carmine has an RGB CMOS sensor, a depth image CMOS sensor, and an IR light source. It also has an array of microphones for voice recognition. All sensors are interfaced in a System On Chip (SOC). Interfacing and powering is performed through USB.
Carmine can capture RGB and depth frames in 640x480 resolution and can sense depth from 0.35 m to 3.5 m. Compared to Kinect, the advantages are small power consumption, small form factor, and good depth sensing range.
Carmine can be interfaced to a PC and it will support Windows, Linux, Mac, and Android platforms. Carmine is supported by OpenNI; developers can program the device using OpenNI and its wrapper libraries.
Apple Inc bought PrimeSense in November 2013. You can buy Carmine at the following link:
http://www.amazon.com/dp/B00KO908MM?psc=1
Introduction to OpenCV, OpenNI, and PCL
Let's discuss the software frameworks and libraries that we are using in our robots. First, we can discuss OpenCV. This is one of the libraries that we are going to use in this robot for object detection and other image processing functionalities.
What is OpenCV?
OpenCV is an open source, BSD-licensed computer vision library that includes hundreds of computer vision algorithms. The library, mainly aimed at real-time computer vision, was developed by Intel Russia research, and is now actively supported by Itseez (http://itseez.com/).
OpenCV logo
OpenCV is written mainly in C and C++ and its primary interface is in C++. It also has good interfaces in Python, Java, and Matlab/Octave, and wrappers in other languages such as C# and Ruby.
In the new version of OpenCV, there is support for CUDA and OpenCL to get GPU acceleration (http://www.nvidia.com/object/cuda_home_new.html).
OpenCV will run on most OS platforms (such as Windows, Linux, Mac OS X, Android, FreeBSD, OpenBSD, iOS, and Blackberry).
In Ubuntu, the OpenCV and Python wrappers are already installed when we install the ros-indigo-desktop-full package. If this package is not installed, then we can install the OpenCV library, ROS interface, and Python interface of OpenCV using the following command:
$ sudo apt-get install ros-indigo-vision-opencv
If you want to install only the OpenCV Python wrapper, then use the following command:
$ sudo apt-get install python-opencv
If you want to try OpenCV in Windows, you can try the following link:
http://docs.opencv.org/doc/tutorials/introduction/windows_install/windows_install.html
The following link will guide you through the installation process of OpenCV on Mac OS X:
http://jjyap.wordpress.com/2014/05/24/installing-opencv-2-4-9-on-mac-osx-with-python-support/
The main applications of OpenCV are in the fields of:
- Object detection
- Gesture recognition
- Human-computer interaction
- Mobile robotics
- Motion tracking
- Facial recognition
Now we can see how to install OpenCV in Ubuntu 14.04.2 from source code.
Installation of OpenCV from source code in Ubuntu 14.04.2
We can install OpenCV from source code in Linux based on the following documentation of OpenCV:
http://docs.opencv.org/doc/tutorials/introduction/linux_install/linux_install.html
After the installation of OpenCV, we can try some examples using the Python wrappers of OpenCV.
Reading and displaying an image using the Python-OpenCV interface
The first example will load an image in grayscale and display it on screen.
In the following section of code, we import the numpy module for image array manipulation; the cv2 module is the OpenCV wrapper for Python through which we can access the OpenCV Python APIs. NumPy is an extension to the Python programming language, adding support for large multidimensional arrays and matrices, along with a large library of high-level mathematical functions to operate on these arrays (https://pypi.python.org/pypi/numpy):
#!/usr/bin/env python
import numpy as np
import cv2
The following function will read the robot.jpg image and load this image in grayscale. The first argument of the cv2.imread() function is the name of the image and the next argument is a flag that specifies the color type of the loaded image. If the flag is > 0, the image is returned as a three-channel RGB color image; if the flag = 0, the loaded image will be a grayscale image; and if the flag is < 0, it will return the same image as loaded:
img = cv2.imread('robot.jpg', 0)
The following section of code will show the read image using the imshow() function. The cv2.waitKey(0) function is a keyboard binding function. Its argument is time in milliseconds. If it's 0, it will wait indefinitely for a keystroke:
cv2.imshow('image', img)
cv2.waitKey(0)
The cv2.destroyAllWindows() function simply destroys all the windows we created:
cv2.destroyAllWindows()
Save the preceding code with a name called image_read.py and copy a JPG file named robot.jpg next to it. Execute the code using the following command:
$ python image_read.py
The output will load an image in grayscale because we used 0 as the value in the imread() function:
The following example will try to open the webcam. The program will quit when the user presses any key.
Capturing from web camera
The following code will capture from the webcam with the device name /dev/video0 or /dev/video1.
We need to import the following modules if we are using the OpenCV APIs:
#!/usr/bin/env python
import numpy as np
import cv2
The following function will create a VideoCapture object. The VideoCapture class is used to capture videos from video files or cameras. The initialization argument of the VideoCapture class is the index of a camera or the name of a video file. The device index is just a number to specify the camera. The first camera index is 0, having the device name /dev/video0; that's why we use 0 here:
cap = cv2.VideoCapture(0)
The following section of code is looped to read image frames from the VideoCapture object and show each frame. It will quit when any key is pressed (cv2.waitKey() returns -1 while no key is pressed):
while(True):
    # Capture frame-by-frame
    ret, frame = cap.read()
    # Display the resulting frame
    cv2.imshow('frame', frame)
    if cv2.waitKey(10) >= 0:
        break
The following is a screenshot of the program output:
You can explore more OpenCV-Python tutorials at the following link:
http://opencv-python-tutroals.readthedocs.org/en/latest/py_tutorials/py_tutorials.html
In the next section, we will look at the OpenNI library and its application.
What is OpenNI
OpenNI is a multi-language, cross-platform framework that defines APIs for writing applications using Natural Interaction (NI). Natural interaction is defined in terms of experience: people naturally communicate through gestures, expressions, and movements, and discover the world by looking around and manipulating physical things.
OpenNI APIs are composed of a set of interfaces to write NI applications. The following figure shows a three-layered view of the OpenNI library:
The top layer represents the application layer that implements natural interaction-based applications. The middle layer is the OpenNI layer; it provides communication interfaces that interact with sensors and middleware components that analyze the data from the sensor. Middleware can be used for full body analysis, hand point analysis, gesture detection, and so on. One example of a middle layer component is NITE, which can detect gestures and skeletons.
The bottom layer shows the hardware devices that capture the visual and audio elements of the scene. It includes 3D sensors, RGB cameras, an IR camera, and a microphone.
OpenNI is cross-platform and has been successfully compiled and deployed on Linux, Mac OS X, and Windows.
In the next section, we will see how to install OpenNI in Ubuntu 14.04.2.
Installing OpenNI in Ubuntu 14.04.2
We can install the OpenNI library along with ROS packages. ROS is already interfaced with OpenNI, but the complete installation of ros-indigo-desktop-full may not install the OpenNI packages; we need to install them from the package manager.
The following is the installation command:
$ sudo apt-get install ros-indigo-openni-launch
The source code and latest build of OpenNI for Windows, Linux, and Mac OS X are available at the following link:
http://structure.io/openni
In the next section, we will see how to install PCL.
WhatisPCL?PCLisalargescale,openprojectfor2D/3Dimage,andPointCloudprocessing.ThePCLframeworkcontainsnumerousalgorithmsincludedtoperformfiltering,featureestimation,surfacereconstruction,registration,modelfitting,andsegmentation.Usingthesemethods,wecanprocessPointCloudandextractkeydescriptorstorecognizeobjectsintheworldbasedontheirgeometricappearanceandcreatesurfacesfromthePointCloudsandvisualizethem.
PCL logo
PCL is released under the BSD license. It's open source and free for commercial or research use. PCL is cross-platform and has been successfully compiled and deployed on Linux, Mac OS X, Windows, and Android/iOS.
You can download PCL at the following link:
http://pointclouds.org/downloads/
PCL is already integrated into ROS. The PCL library and its ROS interface are installed along with the ROS full desktop installation. In the previous chapter, we discussed how to perform the ROS full desktop installation. PCL is the 3D processing backbone of ROS. Refer to the following link for details on the ROS-PCL package:

http://wiki.ros.org/pcl
Programming Kinect with Python using ROS, OpenCV, and OpenNI

Let's look at how we can interface and work with the Kinect sensor in ROS. ROS is bundled with the OpenNI driver, which can fetch the RGB and depth images of Kinect. This package can be used with the Microsoft Kinect, PrimeSense Carmine, Asus Xtion Pro, and Pro Live.
This driver mainly publishes raw depth, RGB, and IR image streams. The openni_launch package will install packages such as openni_camera and openni_launch. The openni_camera package is the Kinect driver that publishes raw data and sensor information, whereas the openni_launch package contains ROS launch files. A launch file is basically an XML file that launches multiple nodes at a time and publishes data such as point clouds.
How to launch the OpenNI driver

The following command will open the OpenNI device and load all nodelets to convert the raw depth/RGB/IR streams to depth images, disparity images, and point clouds. The ROS nodelet package is designed to provide a way to run multiple algorithms in the same process with zero-copy transport between algorithms.
$ roslaunch openni_launch openni.launch
You can view the RGB image using a ROS tool called image_view:

$ rosrun image_view image_view image:=/camera/rgb/image_color
In the next section, we will see how to interface these images to OpenCV for image processing.
The ROS interface of OpenCV

ROS is integrated with many libraries. OpenCV is also integrated into ROS, mainly for image processing. The vision_opencv ROS stack includes the complete OpenCV library and its interface to ROS.
The vision_opencv stack provides several packages:

- cv_bridge: This contains the CvBridge class; this class converts ROS image messages to the OpenCV image data type and vice versa
- image_geometry: This contains a collection of methods to handle image and pixel geometry
The following diagram shows how OpenCV is interfaced with ROS:
OpenCV-ROS interfacing
The image data types of OpenCV are IplImage and Mat. If we want to work with OpenCV in ROS, we have to convert IplImage or Mat to ROS Image messages and back. The ROS package vision_opencv has the CvBridge class; this class can convert IplImage to a ROS image and vice versa.
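The conversion CvBridge performs can be pictured with a toy sketch: a ROS Image message is essentially a flat byte buffer plus metadata (height, width, encoding), while OpenCV wants an indexable array. The functions below are hypothetical illustrations of that round trip, not the real CvBridge API:

```python
# Toy model of the CvBridge idea: pack/unpack a flat byte buffer
# (like sensor_msgs/Image data) to and from a nested pixel array.
# These names are illustrative only; the real class is cv_bridge.CvBridge.

def msg_to_array(height, width, channels, data):
    """Unpack a flat byte buffer into a nested [row][col][channel] list."""
    step = width * channels                     # bytes per image row
    return [[list(data[r * step + c * channels:r * step + (c + 1) * channels])
             for c in range(width)]
            for r in range(height)]

def array_to_msg(img):
    """Flatten a nested image back into (height, width, channels, bytes)."""
    height, width, channels = len(img), len(img[0]), len(img[0][0])
    data = bytes(v for row in img for px in row for v in px)
    return height, width, channels, data

# Round-trip a tiny 2x2, 3-channel (BGR-like) image
msg = (2, 2, 3, bytes(range(12)))
img = msg_to_array(*msg)
assert array_to_msg(img) == msg
```

The real CvBridge additionally handles encodings ("bgr8", "32FC1", and so on) and endianness, but the core job is exactly this reshaping of bytes plus metadata.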
The following section shows how to create a ROS package; this package contains a node that subscribes to the RGB and depth images, processes the RGB image to detect edges, and displays all images after converting them to an image type equivalent to OpenCV.
Creating a ROS package with OpenCV support
We can create a package called sample_opencv_pkg with the following dependencies: sensor_msgs, cv_bridge, rospy, and std_msgs. The sensor_msgs dependency defines messages for commonly used sensors, including cameras and scanning laser rangefinders; cv_bridge is the OpenCV interface of ROS.
The following command will create the ROS package with the preceding dependencies:

$ catkin_create_pkg sample_opencv_pkg sensor_msgs cv_bridge rospy std_msgs
After creating the package, create a scripts folder inside the package and save the code mentioned in the next section in it.
Displaying Kinect images using Python, ROS, and cv_bridge
The first section of the Python code is given below. It mainly includes importing the rospy, sys, cv2, sensor_msgs, cv_bridge, and numpy modules. The sensor_msgs dependency imports the ROS data types Image and CameraInfo. The cv_bridge module imports the CvBridge class for converting the ROS image data type to the OpenCV data type and vice versa:
import rospy
import sys
import cv2
import cv2.cv as cv
from sensor_msgs.msg import Image, CameraInfo
from cv_bridge import CvBridge, CvBridgeError
import numpy as np
The following section of code is a class definition in Python to demonstrate the CvBridge functions. The class is named cvBridgeDemo:
class cvBridgeDemo():
    def __init__(self):
        self.node_name = "cv_bridge_demo"
        # Initialize the ROS node
        rospy.init_node(self.node_name)
        # What we do during shutdown
        rospy.on_shutdown(self.cleanup)
        # Create the OpenCV display window for the RGB image
        self.cv_window_name = self.node_name
        cv.NamedWindow(self.cv_window_name, cv.CV_WINDOW_NORMAL)
        cv.MoveWindow(self.cv_window_name, 25, 75)
        # And one for the depth image
        cv.NamedWindow("Depth Image", cv.CV_WINDOW_NORMAL)
        cv.MoveWindow("Depth Image", 25, 350)
        # Create the cv_bridge object
        self.bridge = CvBridge()
        # Subscribe to the camera image and depth topics and set
        # the appropriate callbacks
        self.image_sub = rospy.Subscriber("/camera/rgb/image_color", Image, self.image_callback)
        self.depth_sub = rospy.Subscriber("/camera/depth/image_raw", Image, self.depth_callback)
        rospy.loginfo("Waiting for image topics...")
The following code gives the callback function for the color image from Kinect. When a color image arrives on the /camera/rgb/image_color topic, it will call this function. This function will process the color frame for edge detection and show both the edge-detected and the raw color image:
    def image_callback(self, ros_image):
        # Use cv_bridge() to convert the ROS image to OpenCV format
        try:
            frame = self.bridge.imgmsg_to_cv(ros_image, "bgr8")
        except CvBridgeError, e:
            print e
        # Convert the image to a NumPy array, since most cv2 functions
        # require NumPy arrays
        frame = np.array(frame, dtype=np.uint8)
        # Process the frame using the process_image() function
        display_image = self.process_image(frame)
        # Display the image
        cv2.imshow(self.node_name, display_image)
        # Process any keyboard commands
        self.keystroke = cv.WaitKey(5)
        if 32 <= self.keystroke and self.keystroke < 128:
            cc = chr(self.keystroke).lower()
            if cc == 'q':
                # The user has pressed the q key, so exit
                rospy.signal_shutdown("User hit q key to quit.")
The following code gives the callback function for the depth image from Kinect. When a depth image arrives on the /camera/depth/image_raw topic, it will call this function. This function will show the raw depth image:
    def depth_callback(self, ros_image):
        # Use cv_bridge() to convert the ROS image to OpenCV format
        try:
            # The depth image is a single-channel float32 image
            depth_image = self.bridge.imgmsg_to_cv(ros_image, "32FC1")
        except CvBridgeError, e:
            print e
        # Convert the depth image to a NumPy array, since most cv2 functions
        # require NumPy arrays
        depth_array = np.array(depth_image, dtype=np.float32)
        # Normalize the depth image to fall between 0 (black) and 1 (white)
        cv2.normalize(depth_array, depth_array, 0, 1, cv2.NORM_MINMAX)
        # Process the depth image
        depth_display_image = self.process_depth_image(depth_array)
        # Display the result
        cv2.imshow("Depth Image", depth_display_image)
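The min-max normalization performed by the cv2.normalize(..., 0, 1, cv2.NORM_MINMAX) call above can be sketched in plain Python: rescale all values linearly so the minimum maps to 0 and the maximum maps to 1.

```python
# Plain-Python equivalent of min-max normalization, as applied to the
# depth array above (the sample depth values here are made up).

def normalize_minmax(values, lo=0.0, hi=1.0):
    vmin, vmax = min(values), max(values)
    if vmax == vmin:                          # avoid division by zero
        return [lo for _ in values]
    scale = (hi - lo) / (vmax - vmin)
    return [lo + (v - vmin) * scale for v in values]

depths = [450.0, 800.0, 1250.0, 2050.0]       # raw depth values in millimetres
print(normalize_minmax(depths))               # [0.0, 0.21875, 0.5, 1.0]
```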
The following function, process_image(), will convert the color image to grayscale, blur the image, and then find the edges using the Canny edge filter:
    def process_image(self, frame):
        # Convert to grayscale
        grey = cv2.cvtColor(frame, cv.CV_BGR2GRAY)
        # Blur the image
        grey = cv2.blur(grey, (7, 7))
        # Compute edges using the Canny edge filter
        edges = cv2.Canny(grey, 15.0, 30.0)
        return edges
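Canny itself involves gradient computation, non-maximum suppression, and hysteresis thresholding, but the core intuition, that large intensity jumps mark edges, can be sketched with a simple difference-and-threshold over one row of grayscale pixels:

```python
# Minimal edge-detection intuition: mark positions where neighbouring
# pixel intensities differ by more than a threshold. This is a teaching
# sketch, not a substitute for cv2.Canny.

def edge_mask(row, threshold):
    """Return 1 where |row[i+1] - row[i]| exceeds threshold, else 0."""
    return [1 if abs(b - a) > threshold else 0
            for a, b in zip(row, row[1:])]

row = [10, 12, 11, 90, 92, 91, 15]            # a bright band in a dark row
print(edge_mask(row, 30))                     # [0, 0, 1, 0, 0, 1]
```

The blur applied before Canny in process_image() suppresses small differences (like the 10 to 12 step here) so only genuine edges survive the threshold.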
The following function, process_depth_image(), simply returns the depth frame:

    def process_depth_image(self, frame):
        # Just return the raw image for this demo
        return frame
This function will close the image windows when the node shuts down:

    def cleanup(self):
        print "Shutting down vision node."
        cv2.destroyAllWindows()
The following code is the main() function. It will initialize the cvBridgeDemo() class and call the rospy.spin() function:

def main(args):
    try:
        cvBridgeDemo()
        rospy.spin()
    except KeyboardInterrupt:
        print "Shutting down vision node."
        cv.DestroyAllWindows()

if __name__ == '__main__':
    main(sys.argv)
Save the preceding code to cv_bridge_demo.py and make the node executable using the following command. The node is only visible to the rosrun command if we give it executable permission:

$ chmod +x cv_bridge_demo.py
The following are the commands to start the driver and the node. Start the Kinect driver using the following command:

$ roslaunch openni_launch openni.launch

Run the node using the following command:

$ rosrun sample_opencv_pkg cv_bridge_demo.py
The following is a screenshot of the output:
RGB, depth, and edge images
Working with point clouds using Kinect, ROS, OpenNI, and PCL

A point cloud is a data structure used to represent a collection of multidimensional points and is commonly used to represent 3D data. In a 3D point cloud, the points usually represent the x, y, and z geometric coordinates of an underlying sampled surface. When color information is present, the point cloud becomes 4D.
Point clouds can be acquired from hardware sensors (such as stereo cameras, 3D scanners, or time-of-flight cameras), or generated synthetically by a computer program. PCL supports the OpenNI 3D interfaces natively; thus it can acquire and process data from devices such as PrimeSense's 3D cameras, Microsoft Kinect, or Asus Xtion Pro.
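The data structure described above can be pictured very simply: a list of (x, y, z) tuples, extended to "4D" by attaching a color to each point. This sketch mirrors the idea only, not any particular PCL type:

```python
# A minimal stand-in for a point cloud: a list of 3D coordinates,
# optionally paired with an RGB color per point. Values are made up.

points = [
    (0.0, 0.0, 1.0),
    (0.1, 0.0, 1.2),
    (0.0, 0.1, 0.8),
]
# Attaching color turns the 3D cloud into a "4D" (XYZRGB-style) cloud
colored = [(x, y, z, (255, 0, 0)) for (x, y, z) in points]

# A typical bulk operation over a cloud: compute its centroid
n = len(points)
centroid = tuple(sum(p[i] for p in points) / n for i in range(3))
print(centroid)
```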
PCL is installed along with the ROS Indigo full desktop installation. Let's see how we can generate and visualize a point cloud in RViz, a data visualization tool in ROS.
Opening the device and generating a point cloud

Open a new terminal and launch the ROS OpenNI driver along with the point cloud generator nodes using the following command:

$ roslaunch openni_launch openni.launch
This command will activate the Kinect driver and process the raw data into convenient outputs such as point clouds.

We will use the RViz 3D visualization tool to view the point clouds.
The following command will start the RViz tool:

$ rosrun rviz rviz
Set the RViz option Fixed Frame (at the top of the Displays panel, under Global Options) to camera_link.

On the left-hand side of the RViz panel, click on the Add button and choose the PointCloud2 display option. Set its topic to /camera/depth/points.

Change the Color Transformer of PointCloud2 to AxisColor.
The following figure shows a screenshot of RViz point cloud data. In this screenshot, near objects are marked in red and far objects are marked in violet and blue. The objects in front of the Kinect are represented as a cylinder and a cube:
Point cloud of a robot
Conversion of point cloud to laser scan data

We are using Kinect in this robot to replicate the function of an expensive laser range scanner. Kinect can deliver point cloud data, which contains the depth of each point in the surroundings. The point cloud data is processed and converted to data equivalent to a laser scanner using the ROS depthimage_to_laserscan package. The main function of this package is to slice a section of the point cloud data and convert it to a laser scan equivalent data type. The point cloud data type is sensor_msgs/PointCloud2, and for the laser scanner the data type is sensor_msgs/LaserScan. This package performs this processing and fakes the laser scanner. The laser scanner output can be viewed using RViz. In order to run the conversion, we have to start the converter nodelets that will perform this operation. We have to specify this in our launch file to start the conversion. The following is the required code in the launch file to start the depthimage_to_laserscan conversion:
<!-- Fake laser -->
<node pkg="nodelet" type="nodelet" name="laserscan_nodelet_manager"
      args="manager"/>
<node pkg="nodelet" type="nodelet" name="depthimage_to_laserscan"
      args="load depthimage_to_laserscan/DepthImageToLaserScanNodelet laserscan_nodelet_manager">
  <param name="scan_height" value="10"/>
  <param name="output_frame_id" value="/camera_depth_frame"/>
  <param name="range_min" value="0.45"/>
  <remap from="image" to="/camera/depth/image_raw"/>
  <remap from="scan" to="/scan"/>
</node>
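The essence of this conversion can be sketched in a few lines: take a horizontal band of the depth image (scan_height rows around the center), and for each column keep the smallest valid depth, yielding one range reading per bearing. The parameter names echo the launch file above, but this is a simplification; the real package also converts pixel columns to bearing angles using the camera model.

```python
# Simplified sketch of the depthimage_to_laserscan idea: collapse a band
# of depth-image rows into a single row of ranges by taking the minimum
# valid depth per column. Depth values below range_min are treated as
# invalid, as in the launch file parameter above.

def depth_band_to_scan(depth_image, center_row, scan_height, range_min):
    half = scan_height // 2
    band = depth_image[center_row - half:center_row + half + 1]
    scan = []
    for col in range(len(band[0])):
        valid = [row[col] for row in band if row[col] >= range_min]
        scan.append(min(valid) if valid else float('inf'))
    return scan

depth = [
    [0.9, 1.5, 0.2],   # 0.2 m is below range_min, so it is ignored
    [0.8, 1.4, 2.0],
    [1.0, 1.6, 2.2],
]
print(depth_band_to_scan(depth, 1, 3, 0.45))   # [0.8, 1.4, 2.0]
```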
Along with starting the nodelet, we need to set certain parameters of the nodelet for better conversion. Refer to http://wiki.ros.org/depthimage_to_laserscan for a detailed explanation of each parameter.

The laser scan of the preceding view is shown in the following screenshot. To view the laser scan, add the LaserScan display option (this is similar to how we added the PointCloud2 option) and change the Topic value of LaserScan to /scan:
Working with SLAM using ROS and Kinect

The main aim of deploying vision sensors in our robot is to detect objects and perform robot navigation in an environment. SLAM is a technique used in mobile robots and vehicles to build up a map of an unknown environment, or update a map within a known environment, while tracking the current location of the robot.
Maps are used to plan the robot's trajectory and to navigate along this path. Using maps, the robot gets an idea about the environment. The two main challenges in mobile robot navigation are mapping and localization.

Mapping involves generating a profile of the obstacles around the robot. Through mapping, the robot understands what the world looks like. Localization is the process of estimating the pose of the robot relative to the map we build.

SLAM fetches data from different sensors and uses it to build maps. 2D/3D vision sensors can be used as inputs to SLAM: 2D sensors such as laser rangefinders and 3D sensors such as Kinect are the typical inputs for a SLAM algorithm.
ROS is integrated with a SLAM library from OpenSlam (http://openslam.org/gmapping.html). The gmapping package provides laser-based SLAM as a node called slam_gmapping. This can create a 2D map from the laser and pose data collected by a mobile robot.

The gmapping package is available at http://wiki.ros.org/gmapping.

To use slam_gmapping, you will need a mobile robot that provides odometry data and is equipped with a horizontally mounted, fixed laser rangefinder. The slam_gmapping node will attempt to transform each incoming scan into the odom (odometry) tf frame.

The slam_gmapping node takes in sensor_msgs/LaserScan messages and builds a map (nav_msgs/OccupancyGrid). The map can be retrieved via a ROS topic or service.
The following command can be used to make a map from a robot with a laser publishing scans on the base_scan topic:

$ rosrun gmapping slam_gmapping scan:=base_scan
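The occupancy grid that slam_gmapping produces can be pictured as a 2D array where each cell is unknown (-1), free (0), or occupied (100), the same convention as nav_msgs/OccupancyGrid. A toy update, restricted to axis-aligned laser rays for brevity, looks like this:

```python
# Toy occupancy-grid update: a laser hit marks its endpoint occupied (100)
# and the cells the beam passed through free (0); untouched cells stay
# unknown (-1). Real SLAM also handles arbitrary ray angles, probabilities,
# and pose uncertainty; this only shows the grid convention.

def update_grid(grid, x0, y0, x1, y1):
    """Trace a horizontal or vertical ray from (x0, y0) to a hit at (x1, y1)."""
    if y0 == y1:
        step = 1 if x1 > x0 else -1
        for x in range(x0, x1, step):
            grid[y0][x] = 0               # free along the beam
    elif x0 == x1:
        step = 1 if y1 > y0 else -1
        for y in range(y0, y1, step):
            grid[y][x0] = 0
    grid[y1][x1] = 100                    # occupied at the hit point

grid = [[-1] * 5 for _ in range(5)]
update_grid(grid, 0, 2, 3, 2)             # robot at (0, 2), obstacle at (3, 2)
print(grid[2])                            # [0, 0, 0, 100, -1]
```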
Questions

1. What are 3D sensors and how are they different from ordinary cameras?
2. What are the main features of a robotic operating system?
3. What are the applications of OpenCV, OpenNI, and PCL?
4. What is SLAM?
5. What is RGB-D SLAM and how does it work?
Summary

In this chapter, we covered the vision sensors to be used in our robot. We used Kinect in our robot and discussed OpenCV, OpenNI, and PCL and their applications. We also discussed the role of vision sensors in robot navigation, a popular SLAM technique, and its application using ROS. In the next chapter, we will discuss the speech processing and synthesis to be used in this robot.
Chapter 8. Working with Speech Recognition and Synthesis Using Python and ROS

In this chapter, we will mainly discuss the following topics:

- Introducing speech recognition, synthesis, and various speech processing frameworks
- Working with speech recognition and synthesis using Python in Ubuntu/Linux, Windows, and Mac OS X
- Working with speech recognition and synthesis packages in ROS using Python
If robots are able to recognize speech and respond the way human beings communicate, robot-human interaction will be much easier and more effective than with any other method. However, extracting speech parameters such as meaning, pitch, duration, and intensity from human speech is a very tough task. Researchers have found numerous ways to solve this problem, and there are now algorithms that do a good job of speech processing.

In this chapter, we will discuss the applications of speech recognition and synthesis in our robot and also look at some of the libraries that perform speech recognition and synthesis.

The main objective of the speech synthesis and recognition system in this robot is to make robot-human interaction easier. If a robot has these abilities, it can communicate with the surrounding people, and they can ask it various questions about the food and the cost of each item. The speech recognition and synthesis functionality can be added using the frameworks that we will discuss in this chapter.

In the first section of this chapter, you will learn about the steps involved in speech recognition and synthesis.
Understanding speech recognition

Speech recognition basically means talking to a computer and making it recognize what we are saying in real time. It converts natural spoken language into a digital format that can be understood by a computer. We are mainly discussing the speech-to-text conversion process here. Using the speech recognition system, the robot will record the sentence or word commanded by the user. The text will be passed to another program, and that program will decide which action to execute. We can take a look at the block diagram of the speech recognition system to understand how it works.
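The "another program decides which action to execute" step mentioned above can be as simple as a dictionary lookup from recognized phrases to actions. The phrases and action strings below are made up for illustration:

```python
# Minimal command dispatcher: map recognized text to a robot action.
# In a real robot, the returned action would trigger motor commands
# instead of just being a string.

def handle_command(text):
    actions = {
        "move forward": "driving forward",
        "turn left": "turning left",
        "stop": "stopping motors",
    }
    # Normalize case/whitespace, then fall back to a safe default
    return actions.get(text.strip().lower(), "unknown command")

print(handle_command("Move Forward"))     # driving forward
print(handle_command("dance"))            # unknown command
```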
Block diagram of a speech recognition system

The following is a block diagram of a typical speech recognition system. We can look at each block and understand how a speech signal is converted to text:
Speech recognition system block diagram
The speech signal is received through a microphone and converted to a digital format, such as PCM (Pulse Code Modulation), by the sound card inside the PC. This digital format can be processed by the software inside the PC. In the speech recognition process, the first phase is to extract the speech features from the digital sound format.

The following are the common components of a speech recognition system:
- Feature extraction: In this process, the raw digital sound is converted to sound feature vectors, which carry information about the sounds and suppress irrelevant sources of sound. The sound feature vectors can be mathematically represented as vectors with respect to time. After the sound feature vectors are obtained, they are decoded to text from a list of possible strings, selected according to probability.
- Acoustic model: The first stage of decoding uses acoustic models. Acoustic models are trained statistical models that are capable of predicting the elementary units of speech, called phonemes, from the sound feature vectors. A popular acoustic modeling approach in speech recognition is the HMM (Hidden Markov Model); another, hybrid approach is to use artificial neural networks.
- Lexicon: A lexicon (also known as a dictionary) contains the phonetic scripts of the words that we use in training the acoustic model.
- Language model: This provides a structure to the stream of detected words, according to the individual word probabilities. The language model is trained using large amounts of training text (which includes the text used to train the acoustic model). It helps to estimate the probabilities and find the appropriate detected words.
- Search algorithm: This is used to find the most probable sequence of words in the sound vectors, with the help of the language model and lexicon.
- Recognized words: The output of the search algorithm is the list of words that has the highest probability for the given sound vectors.
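The final stages above can be illustrated with a toy decoder: given candidate words with acoustic scores and a (unigram) language model, pick the word whose combined probability is highest. Real decoders search over whole word sequences; this sketch scores single words, and all the probabilities are invented:

```python
# Toy illustration of combining an acoustic model score with a language
# model probability, as the search algorithm described above does.

def best_word(acoustic_scores, language_model):
    """Pick the candidate maximizing acoustic score x language model prob."""
    scored = {w: acoustic_scores[w] * language_model.get(w, 1e-6)
              for w in acoustic_scores}
    return max(scored, key=scored.get)

acoustic = {"wreck": 0.40, "recognize": 0.35, "nice": 0.25}
lm = {"recognize": 0.02, "wreck": 0.001, "nice": 0.01}
print(best_word(acoustic, lm))            # recognize
```

Note how "wreck" has the best acoustic score but loses once the language model weighs in; this is exactly why a good language model improves recognition accuracy.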
Speech recognition libraries

The following are some good and popular implementations of speech recognition algorithms in the form of libraries.

CMU Sphinx/PocketSphinx

Sphinx is a group of speech recognition tools developed by Carnegie Mellon University. The entire library is open source, and it comes with acoustic models and sample applications. The acoustic model trainer improves the accuracy of detection. Sphinx allows you to compile its language model and provides a lexicon called cmudict. The current Sphinx version is 4. The Sphinx version customized for embedded systems is called PocketSphinx. It's a lightweight speech recognition engine that works on desktops as well as on mobile devices. Sphinx libraries are available for Windows, Linux, and Mac OS X.

There are Python modules available to handle the PocketSphinx APIs from Python. The following is the official website of CMU Sphinx:
http://cmusphinx.sourceforge.net/
Julius

This is a high-performance, continuous speech recognition library based on HMM that can detect a continuous stream of words or N-grams. It's an open source library that is able to work in real time. There are Python modules to handle the Julius functions from Python. Julius is available for Windows, Linux, and Mac OS X. The official website of Julius is:
http://julius.sourceforge.jp/en_index.php
Windows Speech SDK

Microsoft provides an SDK to handle speech recognition and synthesis operations. The SDK contains APIs for speech-related processing that can be embedded inside Microsoft applications. The SDK only works on Windows, and it has some ports for Python, such as the PySpeech module. These speech APIs are comparatively more accurate than other open source tools.
Speech synthesis

Speech synthesis is the process of converting text data to speech. The following block diagram shows the process involved in converting text to speech:
Block diagram of the speech synthesis process
For more details, refer to page 6 of Spoken Language Processing, X. Huang, A. Acero, H.-W. Hon, Prentice Hall PTR, published in 2001.

Let us take a look at the speech synthesis stages:

- Text analysis: In text analysis, the text to be converted to speech is checked for structure, and linguistic analysis and text normalization are performed to convert numbers and abbreviations to words.
- Phonetic analysis: In phonetic analysis, each individual unit of text, called a grapheme, is converted to an individual, indivisible unit of sound, called a phoneme.
- Prosodic analysis: In prosodic analysis, the prosody of speech (such as rhythm, stress, and intonation) is added to the basic sound to make it more realistic.
- Speech synthesis: This unit finally binds the short units of speech and produces the final speech signal.
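The text analysis stage above can be sketched with a tiny normalizer that expands abbreviations and spells out digits before phonetic analysis. The coverage here is deliberately tiny and the word lists are made up; real systems use large rule sets and pronunciation dictionaries:

```python
# Toy text-normalization pass for the "text analysis" stage of speech
# synthesis: expand abbreviations and convert digits to words.

ABBREVIATIONS = {"dr.": "doctor", "st.": "street"}
DIGITS = {"0": "zero", "1": "one", "2": "two", "3": "three", "4": "four",
          "5": "five", "6": "six", "7": "seven", "8": "eight", "9": "nine"}

def normalize(text):
    words = []
    for token in text.lower().split():
        if token in ABBREVIATIONS:
            words.append(ABBREVIATIONS[token])
        elif token.isdigit():
            # Spell out each digit; real systems would read "42" as
            # "forty-two" using number-grammar rules
            words.extend(DIGITS[d] for d in token)
        else:
            words.append(token)
    return " ".join(words)

print(normalize("Meet Dr. Smith at 42 Baker St."))
# meet doctor smith at four two baker street
```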
Speech synthesis libraries

Let's now discuss a bit about the various speech synthesis libraries.

eSpeak

eSpeak is an open source, lightweight speech synthesizer, mainly for the English language, that also supports several other languages. Using eSpeak, we can change the voices and their characteristics. eSpeak has a module to access its APIs from Python. eSpeak works mainly on Windows and Linux, and it's also compatible with Mac OS X. The official website of eSpeak is as follows:
http://espeak.sourceforge.net/
Festival
Festival is an open source and free speech synthesizer developed by the Centre for Speech Technology Research (CSTR) and is written completely in C++. It provides access to its APIs from the shell in the form of commands, as well as from C++, Java, and Python. It has multi-language support (such as English and Spanish). Festival mainly supports Linux-based platforms; the code can also be built on Windows and Mac OS X. The following is the official website of the Festival speech synthesis system:
http://www.cstr.ed.ac.uk/projects/festival/
Working with speech recognition and synthesis in Ubuntu 14.04.2 using Python

In this section, we will discuss Python interfacing with PocketSphinx, Julius, and Microsoft Speech SDK, and with speech synthesis frameworks such as eSpeak and Festival. Let's start with the speech recognition libraries and their installation procedures.
Setting up PocketSphinx and its Python binding in Ubuntu 14.04.2

The following packages are required to install PocketSphinx and its Python bindings:

- python-pocketsphinx
- pocketsphinx-hmm-wsj1
- pocketsphinx-lm-wsj
These packages can be installed using the apt-get command. PocketSphinx can be installed in Ubuntu either from source code or through the package manager; here, we will use the package manager.

The following command will install the HMM of PocketSphinx:

$ sudo apt-get install pocketsphinx-hmm-wsj1

The following command will install the LM of PocketSphinx:

$ sudo apt-get install pocketsphinx-lm-wsj

The following command will install the Python extension of PocketSphinx:

$ sudo apt-get install python-pocketsphinx

Once we are done with the installation, we can work with Python scripting for speech recognition.
Working with the PocketSphinx Python binding in Ubuntu 14.04.2

The following is the code to perform speech recognition using PocketSphinx and Python. It demonstrates how we can decode speech from a wave file:
#!/usr/bin/env python
import sys
# In Ubuntu 14.04.2, the pocketsphinx module shows an error on the first
# import and works on the second import. The following code is a
# temporary fix to handle that issue.
try:
    import pocketsphinx
except:
    import pocketsphinx
The preceding code imports the pocketsphinx Python module and the Python sys module. The sys module contains functions that can be called during program runtime. In this code, we will use the sys module to get the wave filename from the command-line argument:
if __name__ == "__main__":
    hmdir = "/usr/share/pocketsphinx/model/hmm/en_US/hub4wsj_sc_8k"
    lmdir = "/usr/share/pocketsphinx/model/lm/en_US/hub4.5000.DMP"
    dictd = "/usr/share/pocketsphinx/model/lm/en_US/cmu07a.dic"
The hmdir, lmdir, and dictd variables hold the paths of the HMM, LM (Language Model), and dictionary of PocketSphinx:

    # Receiving the wave file name from the command line argument
    wavfile = sys.argv[1]
The following code passes the HMM, LM, and dictionary paths of PocketSphinx to PocketSphinx's Decoder class, then reads and decodes the wave file. In the end, it prints the detected text:

    speechRec = pocketsphinx.Decoder(hmm=hmdir, lm=lmdir, dict=dictd)
    wavFile = file(wavfile, 'rb')
    speechRec.decode_raw(wavFile)
    result = speechRec.get_hyp()
    print "\n\n\nDetected text:>", result
    print "\n\n\n"
Output

The preceding code can be run using the following command:

$ python <code_name.py> <wave_file_name.wav>

The following is a screenshot of the output. The detected text was not the content of the wave file. The detection accuracy with the default acoustic model and LM is low; we have to train a new model or adapt an existing model to improve accuracy:
The method we just discussed was offline recognition; in the next section, we will see how to perform real-time speech recognition using PocketSphinx, GStreamer, and Python. In this approach, real-time speech data comes through the GStreamer framework and is decoded using PocketSphinx. To work with the GStreamer-PocketSphinx interface, install the following packages.

The following command will install the GStreamer plugin for PocketSphinx:

$ sudo apt-get install gstreamer0.10-pocketsphinx

The following package will install the GStreamer Python binding. It will enable you to use GStreamer APIs from Python:

$ sudo apt-get install python-gst0.10

The following package will install the GStreamer plugin to get information from GConf:

$ sudo apt-get install gstreamer0.10-gconf
Real-time speech recognition using PocketSphinx, GStreamer, and Python in Ubuntu 14.04.2

The following is the code for real-time speech recognition using GStreamer:
#!/usr/bin/env python
# The following modules need to be imported before handling the
# GStreamer APIs
import gobject
import sys
import pygst
pygst.require('0.10')
gobject.threads_init()
import gst
# Module to handle the keyboard interrupt signal
import signal

# Keyboard signal handling routine
def signal_handle(signal, frame):
    print "You pressed Ctrl+C"
    sys.exit(0)

# Implementation of the speech recognition class
class Speech_Recog(object):
    # Initializing the gstreamer pipeline and pocketsphinx element
    def __init__(self):
        self.init_gst()

    # This function will initialize the gstreamer pipeline
    def init_gst(self):
        # The following code creates a gstreamer pipeline from a pipeline
        # description. The required element descriptors are given as
        # parameters.
        self.pipeline = gst.parse_launch('gconfaudiosrc ! audioconvert ! audioresample'
                                         + ' ! vader name=vad auto-threshold=true'
                                         + ' ! pocketsphinx name=asr ! fakesink')
        # Accessing the pocketsphinx element from the gstreamer pipeline
        asr = self.pipeline.get_by_name('asr')
        # Connecting to the asr_result function, called when a
        # speech-to-text conversion is completed
        asr.connect('result', self.asr_result)
        # The user can specify an lm and dict for accurate detection
        # asr.set_property('lm', '/home/user/mylanguagemodel.lm')
        # asr.set_property('dict', '/home/user/mylanguagemodel.dic')
        # This property indicates that all options are configured and
        # recognition can start
        asr.set_property('configured', True)
        # Pausing the GStreamer pipeline at first
        self.pipeline.set_state(gst.STATE_PAUSED)

    # Definition of asr_result
    def asr_result(self, asr, text, uttid):
        # Printing the detected text
        print "Detected Text=> ", text

    # This function will start/stop the speech recognition operation
    def start_recognition(self):
        # VADER (Voice Activity DEtectoR) detects when speech starts and
        # when it ends. Creating the VADER object and setting the property
        # silent to False, so no speech will be detected until a key press
        vader = self.pipeline.get_by_name('vad')
        vader.set_property('silent', False)
        # Waiting for a key press to start recognition
        raw_input("Press any key to start recognition:>")
        # Start playing the pipeline
        self.pipeline.set_state(gst.STATE_PLAYING)
        # Waiting for a key press to stop the recognition
        raw_input("Press any key to stop recognition:>")
        vader = self.pipeline.get_by_name('vad')
        # Setting the silent property of VADER to True
        vader.set_property('silent', True)
        # Pausing the GStreamer pipeline
        self.pipeline.set_state(gst.STATE_PAUSED)

if __name__ == "__main__":
    # Creating an object of the Speech_Recog() class
    app_object = Speech_Recog()
    # Assign the keyboard interrupt handler
    signal.signal(signal.SIGINT, signal_handle)
    while True:
        # Calling the speech recognition routine
        app_object.start_recognition()
The code can be executed using the following command:

$ python <code_name.py>

The following is the screenshot of the output window:
Press any key to start recognition; after this, we can talk and the speech will be converted to text and printed on the terminal window. To stop detection, press any key again and it will pause the GStreamer pipeline.

Another speech recognition tool is Julius. We will see how to install it and work with it using Python.
Speech recognition using Julius and Python in Ubuntu 14.04.2

In this section, we will see how to install the Julius speech recognition system and how to connect it to Python. The required packages (such as Julius and the audio tools) are available in Ubuntu's package manager, but we also need to download and install the Python wrapper separately. Let's start with the required components for the installation.
Installation of the Julius speech recognizer and Python module

The following are the instructions to install Julius and the Python binding in Ubuntu 14.04.2:

The following command will install the Julius speech recognition system:

$ sudo apt-get install julius

The following command will install padsp (the PulseAudio tool). It may be necessary in order to run the Julius speech recognizer in Ubuntu 14.04.2:

$ sudo apt-get install pulseaudio-utils

The following command will install the OSS proxy daemon, which emulates the OSS sound device /dev/dsp in Ubuntu and streams through ALSA. Julius needs the /dev/dsp device to function:

$ sudo apt-get install osspd-alsa

Reload the ALSA process to bind osspd to ALSA:

$ sudo alsa force-reload
To install pyjulius, the Python extension for Julius, you need to install setuptools in Python.

1. To install setuptools, the best option is to download a script from the setuptools website; it's a global script that can be used in any OS. The script can be downloaded from the following link using the wget tool:

$ wget https://bootstrap.pypa.io/ez_setup.py
$ sudo python ez_setup.py

2. The installation details of setuptools are mentioned at https://pypi.python.org/pypi/setuptools
3. After the installation of setuptools, download pyjulius from https://pypi.python.org/pypi/pyjulius/0.3
4. Extract the archive and install the package using the following command:

$ sudo python setup.py install
5. After the installation of pyjulius, install a demo of the Julius tool, which contains an HMM, an LM, and a dictionary of a few words. Download the Julius quick-start files using the following command:

$ wget http://www.repository.voxforge1.org/downloads/software/julius-3.5.2-quickstart-linux.tgz

6. Extract the files and run the commands from the extracted folder.
7. Execute the following command in the extracted folder. It will start speech recognition in the command line:

$ padsp julius -input mic -C julian.jconf

8. To exit speech recognition, press Ctrl + C.
9. To connect to Python, enter the following command:

$ padsp julius -module -input mic -C julian.jconf
This command will start a Julius server. This server listens for clients. If we want to use Julius APIs from Python, we need to connect to the server using client code, as given in the following sections. The Python code is a client that connects to the Julius server and prints the recognized text.
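As a rough illustration of what such a client does under the hood, the following pure-Python sketch pairs a stand-in server (which streams newline-terminated results, standing in for Julius, whose module mode listens on port 10500) with a client that connects and reads results. The framing and messages here are simplified assumptions for illustration; real module-mode Julius sends XML-formatted output, which pyjulius parses for us:

```python
import socket
import threading

# Stand-in for a Julius module-mode server: it accepts one client and
# sends a few newline-terminated "recognition results".
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))   # Julius itself listens on port 10500 by default
srv.listen(1)
port = srv.getsockname()[1]

def serve_once():
    conn, _ = srv.accept()
    for msg in ["HELLO", "WORLD"]:
        conn.sendall((msg + "\n").encode())
    conn.close()

t = threading.Thread(target=serve_once)
t.start()

# Client side: connect and keep reading newline-terminated results, the
# way a module-mode client polls recognition output from the server.
sock = socket.create_connection(("127.0.0.1", port))
buf, results = b"", []
while len(results) < 2:
    chunk = sock.recv(1024)
    if not chunk:
        break
    buf += chunk
    while b"\n" in buf:
        line, buf = buf.split(b"\n", 1)
        results.append(line.decode())
sock.close()
t.join()
print(results)
```

The pyjulius client in the next section follows this same connect-and-poll pattern, with the parsing of Julius's actual output format handled by the library.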
Python-Julius client code

The following code is a Python client of the Julius speech recognition server that we started using the previous command. After connecting to this server, it will trigger speech-to-text conversion, fetch the converted text, and print it on the terminal:
#!/usr/bin/env python
import sys
# Importing the pyjulius module
import pyjulius
# An implementation of a FIFO (First In First Out) queue suitable for
# multithreaded programming
import Queue

# Initialize the Julius client object with the localhost IP and the default
# port of 10500, and try to connect to the server
client = pyjulius.Client('localhost', 10500)
try:
    client.connect()
# Running the client before starting the server will cause a connection error
except pyjulius.ConnectionError:
    print 'Start julius as module first!'
    sys.exit(1)

# Start listening to the server
client.start()
try:
    while 1:
        try:
            # Fetching the recognition result from the server
            result = client.results.get(False)
        except Queue.Empty:
            continue
        print result
except KeyboardInterrupt:
    print 'Exiting...'
    client.stop()        # send the stop signal
    client.join()        # wait for the thread to die
    client.disconnect()  # disconnect from julius
After connecting to the Julius server, the Python client will listen to the server and print its output.

The acoustic models we used in the preceding programs are already trained, but they may not give accurate results for our speech. To improve the accuracy of the previous speech recognition engines, we need to train new language and acoustic models and create a dictionary, or we can adapt the existing models using our voice. The methods to improve accuracy are beyond the scope of this chapter, so some links to train or adapt both PocketSphinx and Julius are given.
Improving speech recognition accuracy in PocketSphinx and Julius

The following link describes how to adapt the existing acoustic model to our voice for PocketSphinx:

http://cmusphinx.sourceforge.net/wiki/tutorialadapt

Julius accuracy can be improved by writing recognition grammar. The following link gives an idea about how to write recognition grammar in Julius:

http://julius.sourceforge.jp/en_index.php?q=en_grammar.html
In the next section, we will see how to connect Python and speech synthesis libraries. We will work with the eSpeak and Festival libraries here. These are two popular, free, and effective speech synthesizers available on all OS platforms. Precompiled binaries are available in Ubuntu in the form of packages.
Setting up eSpeak and Festival in Ubuntu 14.04.2

eSpeak and Festival are speech synthesizers available on the Ubuntu/Linux platform. These applications can be installed from the software package repository of Ubuntu. The following are the instructions and commands to install these packages in Ubuntu:

1. The following commands will install the eSpeak application and its wrapper for Python. We can use this wrapper in our program to access the eSpeak APIs:

$ sudo apt-get install espeak
$ sudo apt-get install python-espeak

2. The following command will install the Festival text-to-speech engine. Festival has some package dependencies; all dependencies will be automatically installed by this command:

$ sudo apt-get install festival
3. After the installation of the Festival application, we can download and install the Python bindings for Festival.
4. Download the Python bindings using the following command. We need the svn tool (Apache Subversion) to download this package. Subversion is a free software versioning and revision control system:

$ svn checkout http://pyfestival.googlecode.com/svn/trunk/ pyfestival-read-only

5. After the download is complete, switch to the pyfestival-read-only folder and install the package using the following command:

$ sudo python setup.py install
Here is the code to work with eSpeak from Python. As you will see, it's very easy to work with the Python binding for eSpeak. We need to write only two lines of code to synthesize speech using Python:

from espeak import espeak
espeak.synth("Hello World")

This code will import the eSpeak-Python wrapper module and call the synth function in the wrapper module. The synth function will synthesize the text given as an argument.
The following code shows how to synthesize speech using Python and Festival:

import festival
festival.say("Hello World")

The preceding code will import the Festival-Python wrapper module and call the say function in the Festival module. It will synthesize the text as speech.
Working with speech recognition and synthesis in Windows using Python

In Windows, there are many tools and frameworks to perform speech recognition and synthesis. The speech recognition libraries we discussed, namely PocketSphinx and Julius, are also supported in Windows. Microsoft also provides SAPI (Speech Application Programming Interface), a set of APIs that allows you to use speech recognition and synthesis from code. These APIs are shipped either with the operating system or with the Microsoft Speech SDK.

In this section, we will demonstrate how to connect Python and the Microsoft Speech SDK to perform speech recognition and synthesis. This procedure will work in Windows 7 and Windows 8, both 32-bit and 64-bit.
Installation of the Speech SDK

The following is the step-by-step procedure to install the Speech SDK and its Python wrapper:

1. Download the Speech SDK from http://www.microsoft.com/en-in/download/details.aspx?id=27226
2. Download and install ActiveState Python 2.7 from http://www.activestate.com/activepython/downloads
3. Download and install the Python wrapper for the Windows Speech SDK from https://pypi.python.org/pypi/speech/. Currently, this project is not active, but it will work fine in Windows 7 and 8
4. Install the package using the following command:

python setup.py install

5. The code to do speech recognition using Python is very simple and is given in the following snippet:

import speech
result = speech.input("Speak")
print result
In this code, we import the speech module. When we import the speech module, the speech recognition panel of Windows will pop up. It will be in the off state at first, and we need to turn it on to perform recognition. The recognized text will be printed on the Python command line.

Next, we can look at speech synthesis using Python. Similar to speech recognition, it's very easy to perform speech synthesis with the speech module. Here is the code:

import speech
speech.say("Hello World")

In this code, speech.say() is the method to convert text to speech.
We have seen some of the speech recognition and synthesis platforms. Now, we can take a look at how to integrate speech recognition and synthesis in ROS. The following section discusses this integration.
Working with speech recognition in ROS Indigo and Python

Compared to other speech recognition methods, one of the easiest and most effective ways to implement real-time speech recognition is the PocketSphinx and GStreamer pipeline. We discussed PocketSphinx, GStreamer, and their interfacing with Python previously. Next, we can look at a ROS package called pocketsphinx that uses the GStreamer pocketsphinx interface to perform speech recognition. The pocketsphinx ROS package is available in the ROS repository. You will find the package information at the following link:

http://wiki.ros.org/pocketsphinx
Installation of the pocketsphinx package in ROS Indigo

To install the pocketsphinx package, first switch to the catkin workspace source folder.

1. Download the source code of the pocketsphinx package using the following command:

$ git clone https://github.com/mikeferguson/pocketsphinx

2. Execute the catkin_make command from the catkin workspace folder to build the package
3. Start the speech recognizer demo using the following command. The RoboCup demo has some basic commands to drive the robot. We can change the commands by adapting the acoustic and language models:

$ roslaunch pocketsphinx robocup.launch

4. Subscribe to /recognizer/output using the following command:

$ rostopic echo /recognizer/output
The following is the screenshot of the output:
This topic can be subscribed to and the commands can be processed in other nodes. In the next section, we will see how to synthesize speech using ROS and Python.
Working with speech synthesis in ROS Indigo and Python

In ROS, there are some packages that perform speech synthesis. Here, we will discuss one such package, named sound_play, which uses Festival as the backend. It has nodes and launch scripts that enable speech synthesis. We need to perform the following steps for speech synthesis:

1. We can install the sound_play package using the following command:

$ sudo apt-get install ros-indigo-sound-play

2. After the installation of the package, we have to create a sample ROS package to interact with the sound_play node. The following is the command to create a sample package in ROS with the sound_play package as a dependency:

$ catkin_create_pkg sample_tts rospy roscpp sound_play std_msgs

3. We have to create a sound_play Python client to send text to the sound_play server node. This client will publish the text that needs to be converted to speech on a topic called /robotsound. The sound_play node in the sound_play package subscribes to this topic and converts the string from the topic to speech.
4. Create a folder named scripts inside the sample_tts package, create the following code in the scripts folder, and name it test.py. The code snippets of test.py are given in the following steps.
5. The following code will import the rospy and sound_play modules. This script will act as a SoundClient, which will connect to the sound_play server and synthesize the speech:

#!/usr/bin/env python
import roslib; roslib.load_manifest('sample_tts')
import rospy, os, sys
from sound_play.msg import SoundRequest
from sound_play.libsoundplay import SoundClient

6. This code will initialize the soundplay_test node and create an object of SoundClient:

if __name__ == '__main__':
    rospy.init_node('soundplay_test', anonymous=True)
    soundhandle = SoundClient()
    rospy.sleep(1)
    soundhandle.stopAll()
7. This code will call functions to synthesize the speech. It can be used to synthesize speech in any package that includes sound_play as a dependency:

    print 'Starting TTS'
    soundhandle.say('Hello world!')
    rospy.sleep(3)
    s = soundhandle.voiceSound("Hello World")
    s.play()
    rospy.sleep(3)
8. The following command starts the sound_play server:

$ roslaunch sound_play soundplay_node.launch

9. The following command will start the test script for speech synthesis:

$ rosrun sample_tts test.py
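The flow in the preceding steps — a client publishing text on the /robotsound topic, and the sound_play node subscribing and converting it to speech — can be sketched with a minimal pure-Python publish/subscribe model. This illustrates only the topic pattern; the real rospy and sound_play APIs differ:

```python
# A toy topic: subscribers register callbacks, publishers push messages.
# This stands in for the ROS topic mechanism, not the rospy API itself.
class Topic(object):
    def __init__(self, name):
        self.name = name
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, message):
        # Every subscriber callback receives the published message
        for cb in self.subscribers:
            cb(message)

spoken = []
robotsound = Topic("/robotsound")
# The server node subscribes and "synthesizes" each string it receives
robotsound.subscribe(lambda text: spoken.append("synthesizing: " + text))
# The client node publishes the text to convert to speech
robotsound.publish("Hello world!")
print(spoken)
```

In ROS, the same decoupling lets any node that publishes on /robotsound drive the speech synthesizer, without knowing anything about the sound_play node's internals.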
Questions

1. What are the basic procedures involved in converting speech to text?
2. What are the functions of the acoustic model and the language model in speech recognition?
3. What are the basic procedures involved in converting text to speech?
4. What are the procedures involved in phonetic analysis and prosodic analysis?
5. How can we improve the recognition accuracy of Sphinx and Julius?
Summary

The main aim of this chapter was to discuss speech recognition and synthesis and how we can implement them on our robot. By adding speech functionalities to our robot, we can make it more interactive than before. We saw the processes involved in speech recognition and synthesis, along with the block diagrams of these processes and the functions of each block. After discussing the blocks, we saw some interesting speech recognition frameworks (such as Sphinx/PocketSphinx, Julius, and the Windows Speech SDK) and synthesis libraries (such as eSpeak and Festival). After discussing these libraries, we worked with the Python interfacing of each library. Towards the end of this chapter, we discussed and worked with the ROS packages that perform speech recognition and synthesis functionalities.
Chapter 9. Applying Artificial Intelligence to ChefBot Using Python

In the previous chapter, we discussed and implemented speech recognition and speech synthesis using Python and ROS. In this chapter, we will discuss how to apply AI to ChefBot so that it can communicate with people intelligently, like a human. These features are add-ons to ChefBot that can increase human-robot interaction and make the robot resemble a human food supplier. In this chapter, we will mainly cover the following topics:

Block diagram of ChefBot's communication system
Introduction to AIML and PyAIML
Interfacing ChefBot's AI module to ROS

AI (Artificial Intelligence) can be defined as the intelligent behavior exhibited by computers or machines. Using AI, we can create a virtual intelligence in machines to perform a specific task like a human. In this chapter, we will use a simple method to apply AI to the robot. This intelligence will work using pattern matching and searching algorithms. The input-output dialog patterns between the user and the robot are stored in files written in Artificial Intelligence Markup Language (AIML), and we will interpret these stored patterns using a Python module called PyAIML. The user can store desired patterns in AIML files and the interpreter module will search for the appropriate response in the dataset. We can develop our own pattern dataset for our robot simply by writing logical patterns. In one section of this chapter, we will see how to write AIML tags and patterns for a robot. In the first section, we will discuss where we will use AI in ChefBot.
Block diagram of the communication system in ChefBot

The following block diagram shows how ChefBot communicates and interacts with humans using speech:
Robot communication block diagram

The robot can convert human speech to text using the speech recognition system and can convert textual data to speech using speech synthesis. We have already discussed these processes in the previous chapter. The AI we will discuss here sits between these two blocks. After the text data is received from the speech-to-text conversion stage, it is sent to the AIML interpreter. The AIML interpreter retrieves the most meaningful reply from the AIML dataset. The dataset of the robot can be anything, such as food details, casual talk, and so on. The user can write any kind of pattern in AIML files. In the case of ChefBot, the user can ask about food details or can command the robot to do something. The robot command system checks whether the converted text is a command to the robot. If it's a command, it is directly sent to the hardware nodes for execution. The output text from AIML is converted to speech using the text-to-speech system. Also, we can add an LCD display to show the result of the speech in the form of an animated face. All these features enhance the robot's interaction. Before we discuss the Python interpreter for AIML, we will discuss AIML and AIML tags in detail.
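The routing step described above can be sketched in a few lines of Python: recognized text is either a robot command (forwarded to the hardware nodes) or a query handed to the AIML interpreter, whose reply is sent to the text-to-speech system. The command set and reply table here are illustrative assumptions, not ChefBot's actual dataset:

```python
# Hypothetical command set and AIML-style reply table, for illustration only
ROBOT_COMMANDS = {"MOVE FORWARD", "STOP", "TURN LEFT", "TURN RIGHT"}
AIML_REPLIES = {"HOW ARE YOU": "I AM FINE"}

def route(recognized_text):
    """Decide where recognized speech goes, as in the block diagram."""
    text = recognized_text.strip().upper()
    if text in ROBOT_COMMANDS:
        # A command: forwarded directly to the hardware nodes
        return ("command", text)
    # Otherwise: look up an AIML-style reply and send it to text-to-speech
    reply = AIML_REPLIES.get(text, "I DO NOT UNDERSTAND")
    return ("speech", reply)

print(route("move forward"))
print(route("How are you"))
```

In the real system, the "speech" branch would be handled by the PyAIML kernel discussed later in this chapter, rather than a fixed dictionary.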
Introduction to AIML

AIML files are a subset of Extensible Markup Language (XML) that can store different text patterns in the form of tags. AIML was developed by the Alicebot free software community (http://www.alicebot.org/). AIML is mainly used to implement chatbots, natural language software agents in which a user can ask questions and receive an intelligent reply. The same technique is used in ChefBot. Using speech recognition, the robot gets input text from the user, and an AIML interpreter, a software program that can interpret AIML files, retrieves an intelligent reply from the AIML dataset. The reply is then converted to speech. AIML files mainly consist of various tags. Here is a set of commonly used AIML tags.

Introduction to AIML tags

AIML files consist of a set of commonly used AIML tags. Let's take a look at them.

The <aiml> tag: Each AIML file begins with this tag and is closed using the </aiml> tag. This tag also has attributes such as the version and encoding scheme of the file. The AIML file can be parsed even without these attributes, but they will be useful in big projects. The version attribute corresponds to the version of AIML that we will use. The encoding attribute is the type of character encoding we will use in this file. The following code snippet shows an example usage of the <aiml> tag:

<aiml version="1.0.1" encoding="UTF-8">
...
</aiml>
The <category> tag: The basic knowledge blocks of AIML are called categories. Each category block consists of two sections: the user input in the form of a sentence and a corresponding response from the robot. The category is represented using the opening <category> tag and closed using the </category> tag. These categories must be inside the <aiml> and </aiml> tags. A category consists of two tags, namely, the <pattern> tag and the <template> tag. The input given by the user goes inside the <pattern> tag and the answer goes inside the <template> tag. For example, look at the following conversation:

User: How are you?
Robot: I am fine.

In this conversation, the user dialog will be in the <pattern> tag and the robot's response will be in the <template> tag. The following code shows the representation of the preceding dialog in the AIML format:

<aiml version="1.0.1" encoding="UTF-8">
<category>
    <pattern>HOW ARE YOU</pattern>
    <template>I AM FINE</template>
</category>
</aiml>

We need to save this file in the .aiml or .xml format.
The <pattern> tag: The pattern tag contains the possible user input. There will be only one <pattern> tag in a category block. The <pattern> tag is the first element of the <category> tag and, inside it, words are separated by single spaces. The sentence in the <pattern> tag may contain words or wildcards such as "*" or "_", which can match a string in that position. In the preceding example, the <pattern>HOW ARE YOU</pattern> code indicates the possible user input in this category.

The <template> tag: The <template> tag contains the possible answers to the user input. The <template> tag is placed within the <category> tag, after the <pattern> tag. The <template> tag can hold a particular answer or trigger programs. We can also give conditional answers. In the preceding code, the <template> tag sentence "I AM FINE" is the answer for the "HOW ARE YOU" pattern. We can insert additional tags in the <template> tag. The following tags are used in the <template> tag:

The <star index="n"/> tag: This tag is used to extract a part of the user's input sentence. The index n indicates which fragment of text is to be extracted from the entire sentence:

<star index="1"/>: This indicates the first fragment of a sentence
<star index="2"/>: This indicates the second fragment of a sentence

The main application of this tag is to extract and reuse text from the user input. The following is a dialog between the robot and the user. The wildcard can be anything, such as a name or something else. Using this tag, we can extract the wildcard portion and use it in the answering section:

User: My name is *
Robot: Nice to meet you *
So, if the user says, "My name is Lentin", then the robot will reply, "Nice to meet you Lentin". This kind of conversation is only possible using the <star> tag and wildcards such as "*". The complete AIML example using the star tag is as follows:

<aiml version="1.0.1" encoding="UTF-8">
<category>
    <pattern>MY NAME IS *</pattern>
    <template>
        NICE TO MEET YOU <star/>
    </template>
</category>
<category>
    <pattern>MEET OUR GUEST * AND *</pattern>
    <template>
        NICE TO MEET YOU <star index="1"/> AND <star index="2"/>.
    </template>
</category>
</aiml>
If we load this example into the AIML interpreter, we will get the following reply when we give the following input:

USER: MY NAME IS LENTIN
ROBOT: NICE TO MEET YOU LENTIN

The previous conversation uses one wildcard and the following conversation uses both wildcards:

USER: MEET OUR GUEST TOM AND JOSEPH
ROBOT: NICE TO MEET YOU TOM AND JOSEPH

Here, the name "TOM" has index "1" and "JOSEPH" is indexed as "2".
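The wildcard behavior above can be imitated in plain Python, showing how an interpreter might turn a pattern containing "*" into a regular expression and substitute the captured fragments back into the template by index. This is a sketch of the idea only, not PyAIML's actual implementation:

```python
import re

def match_pattern(pattern, template, sentence):
    # Turn each "*" in the AIML-style pattern into a capture group
    regex = "^" + re.escape(pattern).replace(r"\*", "(.+)") + "$"
    m = re.match(regex, sentence)
    if not m:
        return None
    # Substitute each captured fragment for the matching <star index="n"/>
    reply = template
    for i, fragment in enumerate(m.groups(), start=1):
        reply = reply.replace('<star index="%d"/>' % i, fragment.strip())
    return reply

print(match_pattern("MY NAME IS *",
                    'NICE TO MEET YOU <star index="1"/>',
                    "MY NAME IS LENTIN"))
print(match_pattern("MEET OUR GUEST * AND *",
                    'NICE TO MEET YOU <star index="1"/> AND <star index="2"/>',
                    "MEET OUR GUEST TOM AND JOSEPH"))
```

Note that the shorthand <star/> (without an index) used in the book's first example is equivalent to <star index="1"/>; this sketch handles only the indexed form.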
The <srai> tag: Using the <srai> tag, we can target multiple patterns from a single <template> tag. With the <srai> tag, the AIML interpreter can search recursively for an answer, replacing the current template text with the template text of another pattern. The following code is an example of the usage of the <srai> tag:

<aiml version="1.0.1" encoding="UTF-8">
<category>
    <pattern>WHAT IS A ROBOT?</pattern>
    <template>
        A ROBOT IS A MACHINE MAINLY DESIGNED FOR EXECUTING REPEATED TASK WITH SPEED AND PRECISION.
    </template>
</category>
<category>
    <pattern>DO YOU KNOW WHAT A * IS?</pattern>
    <template>
        <srai>WHAT IS A <star/></srai>
    </template>
</category>
</aiml>
When a user asks the robot, "DO YOU KNOW WHAT A ROBOT IS", the interpreter goes to the second category's template, extracts the wildcard section "ROBOT" from the user input, and builds the sentence "WHAT IS A ROBOT" inside the <srai> tag. The <srai> tag then invokes the pattern "WHAT IS A ROBOT" and fills the output of that template into the actual template text.
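The recursive reduction that <srai> performs can be sketched in plain Python: the matched template rewrites the input into another pattern and the interpreter recurses to look up that pattern's template. This is a toy illustration of the mechanism, not PyAIML's algorithm:

```python
import re

# Toy knowledge base mirroring the first category above
KNOWLEDGE = {
    "WHAT IS A ROBOT": "A ROBOT IS A MACHINE MAINLY DESIGNED FOR EXECUTING "
                       "REPEATED TASK WITH SPEED AND PRECISION.",
}

def respond(sentence):
    m = re.match(r"^DO YOU KNOW WHAT A (.+) IS$", sentence)
    if m:
        # Mirrors <srai>WHAT IS A <star/></srai>: rewrite the input and recurse
        return respond("WHAT IS A " + m.group(1))
    return KNOWLEDGE.get(sentence, "I DO NOT KNOW")

print(respond("DO YOU KNOW WHAT A ROBOT IS"))
```

Both "WHAT IS A ROBOT" and "DO YOU KNOW WHAT A ROBOT IS" produce the same answer here, which is exactly the point of <srai>: many input forms reduce to one canonical pattern.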
For more AIML tags, you can refer to http://www.alicebot.org/documentation/aiml-reference.html.

After discussing AIML files and tags, we will discuss the Python AIML interpreter used to decode AIML files. The AIML interpreter can retrieve the template text for a user input. We will use a Python module called PyAIML to interpret AIML files. Let's discuss PyAIML in more detail.
Introduction to PyAIML

PyAIML is an open source Python AIML interpreter written completely in pure Python without any third-party dependencies. The module reads all the AIML patterns into memory and builds a directed pattern tree. A backtracking depth-first search algorithm is implemented in this module for pattern matching.
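The pattern tree and depth-first matching described above can be illustrated with a toy sketch. This is only an illustration of the idea (nested dictionaries as tree nodes, with a "*" branch that may absorb one or more words), not PyAIML's actual data structures; real PyAIML also handles the "_" wildcard, match priority, and conversation context:

```python
def insert(tree, words, template):
    # Walk/create one node per word; store the template at the leaf
    node = tree
    for w in words:
        node = node.setdefault(w, {})
    node["__template__"] = template

def match(node, words):
    # Depth-first search: try an exact word branch first, then "*"
    if not words:
        return node.get("__template__")
    head, rest = words[0], words[1:]
    for key in (head, "*"):
        if key in node:
            if key == "*":
                # "*" may absorb one or more words, backtracking as needed
                for i in range(len(words)):
                    found = match(node["*"], words[i + 1:])
                    if found is not None:
                        return found
            else:
                found = match(node[key], rest)
                if found is not None:
                    return found
    return None

tree = {}
insert(tree, "HOW ARE YOU".split(), "I AM FINE")
insert(tree, "MY NAME IS *".split(), "NICE TO MEET YOU")
print(match(tree, "HOW ARE YOU".split()))
print(match(tree, "MY NAME IS LENTIN".split()))
```

Storing patterns as a tree keyed by words means matching cost grows with sentence length rather than with the total number of categories, which is why PyAIML can hold large datasets such as the A.L.I.C.E. files in memory and still respond quickly.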
Now, we can look at how to install PyAIML on our system. The PyAIML module can be installed on Linux, Windows, and Mac OS X. There are prebuilt binaries of PyAIML available for Ubuntu and the source code of this module is also available on GitHub. Currently, we are working with Python version 2.7, or anything less than 2.8, to install PyAIML.

Installing PyAIML on Ubuntu 14.04.2

PyAIML can be installed on Ubuntu using the apt-get command. The binaries are available in the Ubuntu package repositories. The Python version we are working with is 2.7.6 and the PyAIML version we are installing is 0.86. The following command will install PyAIML on Ubuntu 14.04.2:

$ sudo apt-get install python-aiml

You should install Git to get the source code. Also, you should have a Python version greater than or equal to 2.7 and less than 2.8. We require the latest version of the module if we are performing the installation via source code.

Installing PyAIML from source code

We can retrieve the source code of the module using the following git command:

$ git clone git://pyaiml.git.sourceforge.net/gitroot/pyaiml/pyaiml

After cloning the code, change to the cloned directory named pyaiml:

$ cd pyaiml

Install the module using the following command:

$ sudo python setup.py install
Working with AIML and Python

To check whether the module is properly installed on your machine, open a Python IDLE and import the aiml module:

>>> import aiml

If the module is imported correctly, it will not show any error and will move to the next line. Then, we can confirm that the installation is correct.

The most important class we are handling in the aiml module is Kernel(). We mainly use this class to learn from AIML files and get a response from the AIML dataset for user input. The following line will create an object of the aiml.Kernel() class:

>>> mybot = aiml.Kernel()

After creating the Kernel() object, we can assign the robot name using the following command. We will assign Chefbot as the name for this robot:

>>> mybot.setBotPredicate("name", "Chefbot")

The next step is to learn AIML files. We can load either a single AIML file or a group of AIML files. To load a single AIML file, we can use the following command. Note that the sample.aiml file must be in the current path:

>>> mybot.learn('sample.aiml')

The preceding command will load the sample.aiml file into memory and the output is as follows:

Loading sample.aiml... done (0.01 seconds)

If you want to learn more than one AIML file, it's better to use an AIML/XML file; for example, the startup.xml file can load all other AIML files. We will see how startup.xml works in the next section. To learn startup.xml, use the following command:

>>> mybot.learn("startup.xml")

After you learn startup.xml, let's trigger a pattern in startup.xml called "LOAD AIML B". When we call this pattern, it responds by learning all the AIML files and printing a string response after learning each AIML file:

>>> mybot.respond("load aiml b")

After learning the AIML files, we can start inputting text to the kernel object and retrieve the intelligent response using the following code:

>>> while True: print mybot.respond(raw_input("> "))
The complete Python code to load one AIML file and get a response from the AIML dataset is given in the code snippet in the following section. We can pass the AIML file as a command-line argument.
Loading a single AIML file from the command-line argument

We can load a single AIML file using the following code:

#!/usr/bin/env python
import aiml
import sys
mybot = aiml.Kernel()
mybot.learn(sys.argv[1])
while True:
    print mybot.respond(raw_input("Enter input > "))

The following is the sample AIML file required to load in this code. Save the following code with the name sample.aiml:

<aiml version="1.0.1" encoding="UTF-8">
<category>
    <pattern>HOW ARE YOU</pattern>
    <template>I AM FINE</template>
</category>
</aiml>
Save the code as chatbot.py and change the permission of the code using the following command:
$ chmod +x chatbot.py
Execute the code using the following command:
$ ./chatbot.py sample.aiml
It will give you the following result:
Press Ctrl + C to quit the dialog. Each AIML example we previously discussed can be tested using this code.
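The category above is essentially a pattern-to-template lookup. The following stdlib-only sketch is not PyAIML itself — it is a simplified illustration of the matching idea (ignoring AIML wildcards and substitutions) showing how such a category maps normalized input to a reply:

```python
import xml.etree.ElementTree as ET

# The same category as sample.aiml above, inlined for a self-contained example
AIML = """<aiml version="1.0.1">
  <category>
    <pattern>HOW ARE YOU</pattern>
    <template>I AM FINE</template>
  </category>
</aiml>"""

def load_categories(aiml_text):
    """Parse AIML text into a {pattern: template} dictionary."""
    root = ET.fromstring(aiml_text)
    rules = {}
    for category in root.findall('category'):
        pattern = category.find('pattern').text.strip()
        template = category.find('template').text.strip()
        rules[pattern] = template
    return rules

def respond(rules, text):
    """Normalize the input to uppercase, as AIML does, and look up a reply."""
    return rules.get(text.strip().upper(), "I DO NOT UNDERSTAND")

rules = load_categories(AIML)
print(respond(rules, "how are you"))  # prints: I AM FINE
```

PyAIML performs the same normalize-then-match step internally, with a much richer pattern tree that supports wildcards such as `*` and `_`.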
If there is more than one XML file, we can use the following code to load all the AIML files into memory. Let's download an AIML dataset from the Artificial Linguistic Internet Computer Entity (A.L.I.C.E.) robot and load it using PyAIML.
Working with A.L.I.C.E. AIML files

The AIML files of the A.L.I.C.E. chatterbot are freely available at https://code.google.com/p/aiml-en-us-foundation-alice/.
Extract the AIML files to a folder on the desktop or in the home directory and copy startup.xml to these AIML files. This startup.xml file will load all the other AIML files into memory. The following code is an example of a typical startup.xml file:
<aiml version="1.0">
<category>
  <pattern>LOAD AIML B</pattern>
  <template>
    <!-- Load standard AIML set -->
    <learn>*.aiml</learn>
  </template>
</category>
</aiml>
The preceding XML file will learn all the AIML files when we call the LOAD AIML B pattern.
Loading AIML files into memory

The following code will load all the AIML files into memory:
#!/usr/bin/env python
import aiml
import sys
import os

# Change the current path to your AIML files path
os.chdir('/home/lentin/Desktop/aiml-files')
mybot = aiml.Kernel()
# Learn startup.xml
mybot.learn('startup.xml')
# Calling "load aiml b" for loading all AIML files
mybot.respond('load aiml b')
while True:
    print mybot.respond(raw_input("Enter input > "))
You will get the following output:
Loading AIML files will take some time. To avoid this initial loading time, we can dump the AIML patterns loaded in memory and save them to brain files. Loading brain files will save the initial loading time.
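The brain-file idea can be sketched with the standard library: cache the learned patterns once, and let later runs skip the slow learning step. This is a simplified illustration, not PyAIML's actual brain format — the standard.brn name mirrors the book's convention, and the dictionary stands in for the Kernel's learned patterns:

```python
import os
import pickle
import tempfile

# Illustrative cache location; PyAIML saves wherever you point saveBrain()
BRAIN = os.path.join(tempfile.gettempdir(), 'standard.brn')

def learn_patterns():
    """Stand-in for the slow step of learning many AIML files."""
    return {"HOW ARE YOU": "I AM FINE"}

if os.path.isfile(BRAIN):
    # Fast path: load the previously saved "brain"
    with open(BRAIN, 'rb') as f:
        rules = pickle.load(f)
else:
    # Slow path: learn the patterns, then save the brain for next time
    rules = learn_patterns()
    with open(BRAIN, 'wb') as f:
        pickle.dump(rules, f)

print(rules["HOW ARE YOU"])  # prints: I AM FINE
```

PyAIML's saveBrain()/bootstrap() pair, used in the next listings, follows the same load-if-present, otherwise learn-and-save logic.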
Loading AIML files and saving them in brain files

The following code will load AIML files and save them in brain files:
#!/usr/bin/env python
import aiml
import sys
import os

os.chdir('/home/lentin/Desktop/aiml-files')
mybot = aiml.Kernel()
mybot.learn('startup.xml')
mybot.respond('load aiml b')
# Saving the loaded patterns into a brain file
mybot.saveBrain('standard.brn')
while True:
    print mybot.respond(raw_input("Enter input > "))
You will get the following output:
If we want to initialize the robot from the brain file or the AIML files, a better way is to use the bootstrap() method inside the Kernel() class. The bootstrap() method takes a brain file or AIML files and some commands as arguments. The code in the following section loads the brain file if it exists; otherwise, it loads from the AIML files and saves a new brain file. After this process is complete, it responds from the loaded dataset.
Loading AIML and brain files using the Bootstrap method

The following code will load AIML files and brain files using the Bootstrap method:
#!/usr/bin/env python
import aiml
import sys
import os

# Changing current directory to the path of AIML files
# This path will change according to your location of AIML files
os.chdir('/home/lentin/Desktop/aiml-files')
mybot = aiml.Kernel()
# If there is a brain file named standard.brn, Kernel() will initialize
# using the bootstrap() method
if os.path.isfile("standard.brn"):
    mybot.bootstrap(brainFile="standard.brn")
else:
    # If there is no brain file, load all AIML files and save a new brain
    mybot.bootstrap(learnFiles="startup.xml", commands="load aiml b")
    mybot.saveBrain("standard.brn")
# This loop asks for a response from the user and prints the output from
# the Kernel() object
while True:
    print mybot.respond(raw_input("Enter input > "))
Integrating PyAIML into ROS

In this section, we are going to develop ROS Python nodes that can handle AIML files. We are using the Python code that we developed in the preceding section. The ROS version we are going to use is Indigo and the Ubuntu version is 14.04.2. We have already discussed the interfacing of speech recognition and speech synthesis in ROS, and also discussed the Python code to interface AIML files. In this section, we will make a package in ROS to handle AIML files. Currently, there are no active packages in the ROS repositories that can handle AIML files, so we will build our own package using the code that we developed.
Create a ROS package with the following dependencies. Here, the sound_play package is used for speech synthesis:
$ catkin_create_pkg ros_aiml rospy std_msgs sound_play
Create a scripts folder inside the ros_aiml package and create the following Python files in it. Create a folder called data and copy the A.L.I.C.E. AIML dataset we have already downloaded to this folder.
aiml_server.py

The following code acts as an AIML server, in which an AIML client can send user input to the server through the /chatter topic and retrieve the AIML output response through the /response topic:
#!/usr/bin/env python
import rospy
import aiml
import os
import sys
from std_msgs.msg import String

rospy.init_node('aiml_server')
mybot = aiml.Kernel()
# Creating a ROS publisher for the /response topic
response_publisher = rospy.Publisher('response', String, queue_size=10)

# Function to load AIML files using the bootstrap() method
def load_aiml(xml_file):
    # Get the path of the AIML dataset. We have to mention this path in the
    # launch file as a ROS parameter
    data_path = rospy.get_param("aiml_path")
    os.chdir(data_path)
    if os.path.isfile("standard.brn"):
        mybot.bootstrap(brainFile="standard.brn")
    else:
        mybot.bootstrap(learnFiles=xml_file, commands="load aiml b")
        mybot.saveBrain("standard.brn")

# Callback function of the /chatter topic. It will receive input from the
# user, feed it to the respond() method of the Kernel() object, and publish
# the result
def callback(data):
    input = data.data
    response = mybot.respond(input)
    rospy.loginfo("I heard:: %s", data.data)
    rospy.loginfo("I spoke:: %s", response)
    response_publisher.publish(response)

# Method to create a subscriber on the /chatter topic
def listener():
    rospy.loginfo("Starting ROS AIML Server")
    rospy.Subscriber("chatter", String, callback)
    # spin() simply keeps python from exiting until this node is stopped
    rospy.spin()

if __name__ == '__main__':
    load_aiml('startup.xml')
    listener()
aiml_client.py

This is a simple client code that will send user input taken from the keyboard to the AIML server. The user input is sent through the /chatter topic:
#!/usr/bin/env python
import rospy
from std_msgs.msg import String

# Creating a publisher for the chatter topic
pub = rospy.Publisher('chatter', String, queue_size=10)
rospy.init_node('aiml_client')
r = rospy.Rate(1)  # 1 Hz
while not rospy.is_shutdown():
    # Receiving text input from the user
    input = raw_input("Enter your text :> ")
    # Publishing to the chatter topic
    pub.publish(input)
    r.sleep()
aiml_tts_client.py

This client will subscribe to the response from aiml_server and convert it to speech. This code is adapted from the speech synthesis client we discussed in the previous chapter. We will use the sound_play package to perform TTS:
#!/usr/bin/env python
import rospy, os, sys
from sound_play.msg import SoundRequest
from sound_play.libsoundplay import SoundClient
from std_msgs.msg import String

rospy.init_node('aiml_soundplay_client', anonymous=True)
soundhandle = SoundClient()
rospy.sleep(1)
soundhandle.stopAll()
print 'Starting TTS'

# Callback method to receive text from the /response topic and convert it to speech
def get_response(data):
    response = data.data
    rospy.loginfo("Response:: %s", response)
    soundhandle.say(response)

# Method to create a subscriber for the /response topic
def listener():
    rospy.loginfo("Starting listening to response")
    rospy.Subscriber("response", String, get_response, queue_size=10)
    rospy.spin()

if __name__ == '__main__':
    listener()
aiml_speech_recog_client.py

This client can send speech-to-text data to the AIML server instead of typing on the keyboard. Before running this code, we have to launch the Pocket Sphinx speech recognizer. We will see how to run this code after discussing it:
#!/usr/bin/env python
import rospy
from std_msgs.msg import String

rospy.init_node('aiml_speech_recog_client')
pub = rospy.Publisher('chatter', String, queue_size=10)
r = rospy.Rate(1)  # 1 Hz

# The pocketsphinx package sends the converted text to the /recognizer/output
# topic. The following function is the callback of this topic. The received
# text is sent through the /chatter topic, which is received by the AIML server
def get_speech(data):
    speech_text = data.data
    rospy.loginfo("I said:: %s", speech_text)
    pub.publish(speech_text)

# Creating a subscriber for the pocketsphinx output topic /recognizer/output
def listener():
    rospy.loginfo("Starting Speech Recognition")
    rospy.Subscriber("/recognizer/output", String, get_speech)
    rospy.spin()

if __name__ == '__main__':
    listener()
Let's see how these nodes communicate with the AIML server:
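The rospy nodes above communicate only through named topics. As a rough sketch of that wiring — a toy in-process bus, not rospy — the following shows how /chatter and /response connect the client, the server, and the TTS client:

```python
from collections import defaultdict

# Toy topic bus: topic name -> list of subscriber callbacks
subscribers = defaultdict(list)

def subscribe(topic, callback):
    subscribers[topic].append(callback)

def publish(topic, message):
    for callback in subscribers[topic]:
        callback(message)

rules = {"HOW ARE YOU": "I AM FINE"}
spoken = []

# aiml_server: subscribes to /chatter, publishes the reply on /response
def aiml_server(text):
    publish('/response', rules.get(text.upper(), "I DO NOT UNDERSTAND"))

# TTS client: subscribes to /response; a real node would call soundhandle.say()
def tts_client(text):
    spoken.append(text)

subscribe('/chatter', aiml_server)
subscribe('/response', tts_client)

# aiml_client: publishes the user input on /chatter
publish('/chatter', 'how are you')
print(spoken[0])  # prints: I AM FINE
```

In ROS, the same decoupling means the keyboard client and the speech recognition client are interchangeable: both simply publish on /chatter.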
After creating all the scripts in the scripts folder, create another folder called launch in the ros_aiml package to store the launch files. This helps to launch all the nodes in a single run. Create the following launch files in the launch folder.
start_chat.launch

This launch file will launch aiml_server.py and aiml_client.py, in which the user can input text and receive the response as text. The aiml_path ROS parameter has to be mentioned in the launch file:
<launch>
  <param name="aiml_path" value="/home/lentin/catkin_ws/src/ros_aiml/data"/>
  <node name="aiml_server" pkg="ros_aiml" type="aiml_server.py" output="screen">
  </node>
  <node name="aiml_client" pkg="ros_aiml" type="aiml_client.py" output="screen">
  </node>
</launch>
start_tts_chat.launch

This launch file will launch the text input and speech synthesis of the AIML response:
<launch>
  <param name="aiml_path" value="/home/lentin/catkin_ws/src/ros_aiml/data"/>
  <node name="aiml_server" pkg="ros_aiml" type="aiml_server.py" output="screen">
  </node>
  <include file="$(find sound_play)/soundplay_node.launch">
  </include>
  <node name="aiml_tts" pkg="ros_aiml" type="aiml_tts_client.py" output="screen">
  </node>
  <node name="aiml_client" pkg="ros_aiml" type="aiml_client.py" output="screen">
  </node>
</launch>
start_speech_chat.launch

This file will launch the speech recognition client, the speech synthesis client, and the AIML server. It will not launch pocketsphinx; we need to launch it separately. This launch file enables you to receive text input from the speech recognizer and convert the response to speech as well:
<launch>
  <param name="aiml_path" value="/home/lentin/catkin_ws/src/ros_aiml/data"/>
  <node name="aiml_server" pkg="ros_aiml" type="aiml_server.py" output="screen">
  </node>
  <include file="$(find sound_play)/soundplay_node.launch"></include>
  <node name="aiml_tts" pkg="ros_aiml" type="aiml_tts_client.py" output="screen">
  </node>
  <node name="aiml_speech_recog" pkg="ros_aiml" type="aiml_speech_recog_client.py" output="screen">
  </node>
</launch>
After creating all the launch files, change the permission of all the launch files using the following command, which we have to execute in the launch folder:
$ chmod +x *.launch
The folder structure of this package is given in the following diagram. After creating this package, verify it with this diagram:
Launch the AIML server and client for text chatting using the following command:
$ roslaunch ros_aiml start_chat.launch
When you launch this file, the user can input text and the reply will be printed.
Launch the AIML server and client with text chatting and speech synthesis using the following command:
$ roslaunch ros_aiml start_tts_chat.launch
When you launch this file, it will start the text chatting interface and the reply text will be synthesized using the speech synthesizer.
Start the pocketsphinx demo launch file to start speech recognition. The following launch file is a demo that will detect some words and sentences. If we want more accuracy, we have to train the model:
$ roslaunch pocketsphinx robocup.launch
Launch the AIML server, speech recognition, and synthesis clients using the following command:
$ roslaunch ros_aiml start_speech_chat.launch
After you run the preceding launch files, the user can interact with the AIML server using speech and will get the output as speech as well.
Questions

1. What is Artificial Intelligence?
2. What is the use of an AIML file?
3. Which are the most commonly used AIML tags?
4. What is the use of the PyAIML module?
Summary

In this chapter, we discussed how to add Artificial Intelligence to ChefBot in order to interact with people. This function is an add-on to ChefBot to increase the interactivity of the robot. We used simple AI techniques, such as pattern matching and searching, in ChefBot. The pattern datasets are stored in a special type of file called AIML. The Python interpreter module for AIML is called PyAIML, and we used it to decode AIML files. The user can store pattern data in the AIML format and PyAIML can interpret these patterns. This method is similar to a stimulus-response system: the user gives a stimulus in the form of text data and, from the AIML patterns, the module finds the appropriate reply to the user input.

We saw the entire communication system of the robot and how the robot communicates with people. It includes speech recognition and synthesis along with AI. We already discussed speech in the previous chapter. We also saw the useful tags used in AIML, the PyAIML installation, how they work, and some examples. Finally, we implemented the entire code in ROS along with the speech recognition and synthesis units. In the next chapter, we will discuss the integration of components in the robot, which we have not discussed until now.
Chapter 10. Integration of ChefBot Hardware and Interfacing it into ROS, Using Python

In Chapter 2, Mechanical Design of a Service Robot, we saw the ChefBot chassis design, and now we have the manufactured parts of this robot. In this chapter, we will see how to assemble this robot using these parts and also the final interfacing of the sensors and other electronic components of this robot to Tiva C LaunchPad. We have already discussed the interfacing of individual robot components and sensors with Launchpad. In this chapter, we will try to interface the necessary robotic components and sensors of ChefBot and program it in such a way that it will receive the values from all sensors and control information from the PC. Launchpad will send all sensor values via a serial port to the PC and also receive control information (such as reset commands, speed, and so on) from the PC.
After the sensor values are received on the PC, a ROS Python node will read the serial values and convert them to ROS Topics. There are Python nodes present in the PC that subscribe to the sensor data and produce odometry. The data from the wheel encoders and the IMU values are combined to calculate the odometry of the robot. The robot detects obstacles by subscribing to the ultrasonic sensor and laser scan topics, and the speed of the wheel motors is controlled by using the PID node. This node converts the linear velocity command to differential wheel velocity. After running these nodes, we can run SLAM to map the area and, after running SLAM, we can run the AMCL nodes for localization and autonomous navigation.
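The linear-to-differential conversion mentioned above can be sketched as follows. The wheel separation value is illustrative, not ChefBot's actual dimension:

```python
# Assumed wheel separation in metres, for illustration only
WHEEL_SEPARATION = 0.30

def twist_to_wheel_speeds(linear, angular):
    """Convert a (linear m/s, angular rad/s) command to (left, right)
    wheel linear speeds using the standard differential-drive kinematics."""
    left = linear - angular * WHEEL_SEPARATION / 2.0
    right = linear + angular * WHEEL_SEPARATION / 2.0
    return left, right

print(twist_to_wheel_speeds(0.2, 0.0))  # straight line: both wheels equal
print(twist_to_wheel_speeds(0.0, 1.0))  # turn in place: wheels equal and opposite
```

This is why one motor is wired in reverse polarity on the driver board: with identical drive signals, mirrored motors would otherwise spin in opposite directions.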
In the first section of this chapter, Building ChefBot hardware, we will see how to assemble the ChefBot hardware using its body parts and electronic components.
Building ChefBot hardware

The first section of the robot that needs to be configured is the base plate. The base plate consists of two motors and their wheels, caster wheels, and base plate supports. The following image shows the top and bottom view of the base plate:
Base plate with motors, wheels, and caster wheels
The base plate has a radius of 15 cm and the motors with wheels are mounted on opposite sides of the plate by cutting a section from the base plate. A rubber caster wheel is mounted on the opposite side of the base plate to give the robot good balance and support. We can choose either ball caster wheels or rubber caster wheels. The wires of the two motors are taken to the top of the base plate through a hole in the center of the base plate. To extend the layers of the robot, we will put base plate supports to connect the next layers. Now, we can see the next layer with the middle plate and connecting tubes. There are hollow tubes, which connect the base plate and the middle plate. A support is provided on the base plate for the hollow tubes. The following figure shows the middle plate and connecting tubes:
Middle plate with connecting tubes
The connecting tubes will connect the base plate and the middle plate. There are four hollow tubes that connect the base plate to the middle plate. One end of these tubes is hollow, which can fit in the base plate support, and the other end is inserted with a hard plastic fitting with an option to put a screw in the hole. The middle plate has no support except four holes:
Fully assembled robot body
The middle plate male connector helps to connect the middle plate and the top of the base plate tubes. At the top of the middle plate tubes, we can fit the top plate, which has four supports on the back. We can insert the top plate female connector into the top plate support, and this is how we will get the fully assembled body of the robot.
The bottom layer of the robot can be used to put the Printed Circuit Board (PCB) and battery. In the middle layer, we can put Kinect and Intel NUC. We can put a speaker and a mic if needed. We can use the top plate to carry food. The following figure shows the PCB prototype of the robot; it consists of Tiva C LaunchPad, a motor driver, level shifters, and provisions to connect two motors, ultrasonic, and IMU:
ChefBot PCB prototype
The board is powered with a 12 V battery placed on the base plate. The two motors can be directly connected to the M1 and M2 male connectors. The NUC PC and Kinect are placed on the middle plate. The Launchpad board and Kinect should be connected to the NUC PC via USB. The PC and Kinect are powered using the same 12 V battery itself. We can use a lead-acid or lithium-polymer battery. Here, we are using a lead-acid cell for testing purposes. We will migrate to lithium-polymer for better performance and better backup. The following figure shows the complete assembled diagram of ChefBot:
Fully assembled robot body
After assembling all the parts of the robot, we will start working with the robot software. ChefBot's embedded code and ROS packages are available on GitHub. We can clone the code and start working with the software.
Configuring ChefBot PC and setting ChefBot ROS packages

In ChefBot, we are using Intel's NUC PC to handle the robot sensor data and its processing. After procuring the NUC PC, we have to install Ubuntu 14.04.2 or the latest updates of 14.04 LTS. After the installation of Ubuntu, install the complete ROS and the packages we mentioned in the previous chapters. We can configure this PC separately, and after the completion of all the settings, we can put it into the robot. The following are the procedures to install the ChefBot packages on the NUC PC.
Clone ChefBot's software packages from GitHub using the following command:
$ git clone https://github.com/qboticslabs/Chefbot_ROS_pkg.git
We can clone the code on our laptop and copy the chefbot folder to Intel's NUC PC. The chefbot folder consists of the ROS packages of ChefBot. On the NUC PC, create a ROS catkin workspace, copy the chefbot folder, and move it inside the src directory of the catkin workspace.
Build and install the source code of ChefBot by simply using the following command. This should be executed inside the catkin workspace we created:
$ catkin_make
If all the dependencies are properly installed on the NUC, then the ChefBot packages will build and install on this system. After setting up the ChefBot packages on the NUC PC, we can switch to the embedded code for ChefBot. Now, we can connect all the sensors to Launchpad. After uploading the code to Launchpad, we can again discuss the ROS packages and how to run them. The cloned code from GitHub contains the Tiva C LaunchPad code, which is going to be explained in the upcoming section.
Interfacing ChefBot sensors with Tiva C LaunchPad

We have discussed the interfacing of the individual sensors that we are going to use in ChefBot. In this section, we will discuss how to integrate the sensors into the Launchpad board. The Energia code to program Tiva C LaunchPad is available in the cloned files on GitHub. The connection diagram of Tiva C LaunchPad with the sensors is as follows. From this figure, we get to know how the sensors are interconnected with Launchpad:

Sensor interfacing diagram of ChefBot
M1 and M2 are the two differential drive motors that we are using in this robot. The motors we are going to use here are DC geared motors with an encoder from Pololu. The motor terminals are connected to the VNH2SP30 motor driver from Pololu. One of the motors is connected in reverse polarity because, in differential steering, one motor rotates opposite to the other. If we send the same control signal to both motors, each motor will rotate in the opposite direction. To avoid this condition, we will connect one motor in opposite polarity. The motor driver is connected to Tiva C LaunchPad through a 3.3 V-5 V bidirectional level shifter. One of the level shifters we will use here is available at: https://www.sparkfun.com/products/12009.
The two channels of each encoder are connected to Launchpad via a level shifter. Currently, we are using one ultrasonic distance sensor for obstacle detection. In the future, we could expand this number, if required. To get a good odometry estimate, we will put the IMU sensor MPU 6050 on an I2C interface. The pins are directly connected to Launchpad because MPU 6050 is 3.3 V compatible. To reset Launchpad from ROS nodes, we are allocating one pin as an output and connecting it to the reset pin of Launchpad. When a specific character is sent to Launchpad, it will set the output pin to high and reset the device. In some situations, the error from the calculation may accumulate and it can affect the navigation of the robot. We are resetting Launchpad to clear this error. To monitor the battery level, we are allocating another pin to read the battery value. This feature is not currently implemented in the Energia code.
The code you downloaded from GitHub consists of the embedded code. We can see the main sections of the code here, and there is no need to explain all the sections because we have already discussed them.
Embedded code for ChefBot

The main sections of the Launchpad code are discussed here. The following are the header files used in the code:
// Library to communicate with I2C devices
#include "Wire.h"
// I2C communication library for MPU6050
#include "I2Cdev.h"
// MPU6050 interfacing library
#include "MPU6050_6Axis_MotionApps20.h"
// Processing incoming serial data
#include <Messenger.h>
// Contains definitions of maximum limits of various data types
#include <limits.h>
The main libraries used in this code are for communicating with MPU 6050 and processing the incoming serial data to Launchpad. MPU 6050 can provide the orientation in quaternion or Euler values by using the inbuilt Digital Motion Processor (DMP). The functions to access the DMP are written in MPU6050_6Axis_MotionApps20.h. This library has dependencies such as I2Cdev.h and Wire.h; that's why we are including these headers as well. These two libraries are used for I2C communication. The Messenger.h library allows you to handle a stream of text data from any source and helps to extract the data from it. The limits.h header contains definitions of the maximum limits of various data types.
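On the PC side, a quaternion reported by the DMP is commonly converted to Euler angles. As a hedged sketch (not ChefBot's driver code), the yaw — rotation about the Z axis, the one that matters for planar odometry — can be extracted with the standard formula:

```python
import math

def quaternion_to_yaw(qx, qy, qz, qw):
    """Yaw (rotation about Z) in radians from a unit quaternion,
    using the standard quaternion-to-Euler conversion."""
    return math.atan2(2.0 * (qw * qz + qx * qy),
                      1.0 - 2.0 * (qy * qy + qz * qz))

# A quaternion for a 90-degree rotation about Z gives a yaw of about pi/2
print(quaternion_to_yaw(0.0, 0.0, 0.7071, 0.7071))
```

The ROS driver we write later publishes the raw qx, qy, qz, qw components as separate topics; a conversion like this is what turns them into a usable heading.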
After we include the header files, we need to create an object to handle MPU 6050 and process the incoming serial data using the Messenger class:
// Creating MPU6050 Object
MPU6050 accelgyro(0x68);
// Messenger object
Messenger Messenger_Handler = Messenger();
After declaring the Messenger object, the main section is to assign pins for the motor driver, encoder, ultrasonic sensor, MPU 6050, reset, and battery pins. After assigning the pins, we can see the setup() function of the code. The definition of the setup() function is given here:
// Setup serial, encoders, ultrasonic, MPU6050 and Reset functions
void setup()
{
  // Init Serial port with 115200 baud rate
  Serial.begin(115200);
  // Setup Encoders
  SetupEncoders();
  // Setup Motors
  SetupMotors();
  // Setup Ultrasonic
  SetupUltrasonic();
  // Setup MPU6050
  Setup_MPU6050();
  // Setup Reset pins
  SetupReset();
  // Setup Messenger object handler
  Messenger_Handler.attach(OnMssageCompleted);
}
The preceding function contains a custom routine to configure and allocate pins for all the sensors. This function will initialize serial communication with a 115,200 baud rate and set pins for the encoder, motor driver, ultrasonic, and MPU 6050. The SetupReset() function will assign a pin to reset the device, as shown in the connection diagram. We have already seen the setup routines of each sensor in the previous chapters, so there is no need to explain the definition of each function. The Messenger class handler is attached to a function called OnMssageCompleted(), which will be called when data is input to Messenger_Handler.
The following is the main loop() function of the code. The main purpose of this function is to read and process the serial data and send the available sensor values as well:
void loop()
{
  // Read from Serial port
  Read_From_Serial();
  // Send time information through serial port
  Update_Time();
  // Send encoder values through serial port
  Update_Encoders();
  // Send ultrasonic values through serial port
  Update_Ultra_Sonic();
  // Update motor speed values with corresponding speed received from PC
  // and send speed values through serial port
  Update_Motors();
  // Send MPU6050 values through serial port
  Update_MPU6050();
  // Send battery values through serial port
  Update_Battery();
}
The Read_From_Serial() function will read serial data from the PC and feed the data to the Messenger_Handler handler for processing purposes. The Update_Time() function will update the time after each operation in the embedded board. We can take this time value to process in the PC or take the PC time for processing.
We can compile the code in Energia IDE and burn the code into Launchpad. After uploading the code, we can discuss the ROS nodes to handle the Launchpad sensor values.
Writing a ROS Python driver for ChefBot

After uploading the embedded code to Launchpad, the next step is to handle the serial data from Launchpad and convert it to ROS Topics for further processing. The launchpad_node.py ROS Python driver node interfaces Tiva C LaunchPad to ROS. The launchpad_node.py file is in the scripts folder, which is inside the chefbot_bringup package. The following is an explanation of the important code sections of launchpad_node.py:
# ROS Python client
import rospy
import sys
import time
import math

# This python module helps to receive values from the serial port,
# which executes in a thread
from SerialDataGateway import SerialDataGateway
# Importing required ROS data types for the code
from std_msgs.msg import Int16, Int32, Int64, Float32, String, Header, UInt64
# Importing ROS data type for IMU
from sensor_msgs.msg import Imu
The launchpad_node.py file imports the preceding modules. The main module we can see is SerialDataGateway. This is a custom module written to receive serial data from the Launchpad board in a thread. We also need some data types of ROS to handle the sensor data. The main function of the node is given in the following code snippet:
if __name__ == '__main__':
    rospy.init_node('launchpad_ros', anonymous=True)
    launchpad = Launchpad_Class()
    try:
        launchpad.Start()
        rospy.spin()
    except rospy.ROSInterruptException:
        rospy.logwarn("Error in main function")
    launchpad.Reset_Launchpad()
    launchpad.Stop()
The main class of this node is called Launchpad_Class(). This class contains all the methods to start, stop, and convert serial data to ROS Topics. In the main function, we will create an object of Launchpad_Class(). After creating the object, we will call the Start() method, which will start the serial communication between Tiva C LaunchPad and the PC. If we interrupt the driver node by pressing Ctrl + C, it will reset the Launchpad and stop the serial communication between the PC and Launchpad.
The following code snippet is from the constructor function of Launchpad_Class(). In the following snippet, we will retrieve the port and baud rate of the Launchpad board from the ROS parameters and initialize the SerialDataGateway object using these parameters. The SerialDataGateway object calls the _HandleReceivedLine() function inside this class when any incoming serial data arrives on the serial port.
This function will process each line of serial data and extract, convert, and insert it into the appropriate headers of each ROS Topic data type:
# Get serial port and baud rate of Tiva C Launchpad
port = rospy.get_param("~port", "/dev/ttyACM0")
baudRate = int(rospy.get_param("~baudRate", 115200))

#################################################################
rospy.loginfo("Starting with serial port: " + port + ", baud rate: " + str(baudRate))

# Initializing SerialDataGateway object with serial port, baud
# rate and callback function to handle incoming serial data
self._SerialDataGateway = SerialDataGateway(port, baudRate, self._HandleReceivedLine)
rospy.loginfo("Started serial communication")
################################################################### Subscribers and Publishers

# Publisher for left and right wheel encoder values
self._Left_Encoder = rospy.Publisher('lwheel', Int64, queue_size=10)
self._Right_Encoder = rospy.Publisher('rwheel', Int64, queue_size=10)

# Publisher for battery level (for upgrade purposes)
self._Battery_Level = rospy.Publisher('battery_level', Float32, queue_size=10)

# Publisher for ultrasonic distance sensor
self._Ultrasonic_Value = rospy.Publisher('ultrasonic_distance', Float32, queue_size=10)

# Publisher for IMU rotation quaternion values
self._qx_ = rospy.Publisher('qx', Float32, queue_size=10)
self._qy_ = rospy.Publisher('qy', Float32, queue_size=10)
self._qz_ = rospy.Publisher('qz', Float32, queue_size=10)
self._qw_ = rospy.Publisher('qw', Float32, queue_size=10)

# Publisher for the entire serial data
self._SerialPublisher = rospy.Publisher('serial', String, queue_size=10)
We create ROS publisher objects for sensors such as the encoders, IMU, and ultrasonic sensor, as well as for the entire serial data for debugging purposes. We also subscribe to the speed commands for the left-hand and right-hand wheels of the robot. When a speed command arrives on a Topic, it calls the respective callback, which sends the speed command to the robot's Launchpad:
self._left_motor_speed = rospy.Subscriber('left_wheel_speed', Float32, self._Update_Left_Speed)
self._right_motor_speed = rospy.Subscriber('right_wheel_speed', Float32, self._Update_Right_Speed)
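The bodies of these callbacks are not shown here. The following is a rough sketch of what such a callback can do; note that the class name, the write_line parameter, and the "s &lt;left&gt; &lt;right&gt;" wire format below are illustrative assumptions for this sketch, not the actual launchpad_node.py implementation:

```python
class SpeedCommandForwarder:
    """Sketch of a driver-side speed callback.

    `write_line` stands in for the serial gateway's write method; the
    's <left> <right>' wire format is an assumption for illustration,
    not the actual Launchpad protocol.
    """

    def __init__(self, write_line):
        self._write_line = write_line
        self._left = 0.0
        self._right = 0.0

    def update_left_speed(self, speed):
        self._left = speed
        self._send()

    def update_right_speed(self, speed):
        self._right = speed
        self._send()

    def _send(self):
        # Send both wheel targets in one line, since the embedded code
        # typically expects a complete command per line.
        self._write_line("s %.2f %.2f\r" % (self._left, self._right))

sent = []
fwd = SpeedCommandForwarder(sent.append)
fwd.update_left_speed(0.25)
fwd.update_right_speed(-0.25)
```

The real node forwards the values through SerialDataGateway instead of a plain callable, but the flow is the same: cache the latest wheel targets and emit a complete command line.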
After setting up the ChefBot driver node, we need to interface the robot to the ROS navigation stack in order to perform autonomous navigation. The basic requirement for autonomous navigation is that the robot driver nodes receive velocity commands from the ROS navigation stack. The robot can also be controlled using teleoperation. In addition to these features, the robot must be able to compute its positional or odometry data and generate the tf data to send to the navigation stack. There must also be a PID controller to control the robot's motor velocity. The differential_drive ROS package contains nodes to perform all of these operations, and we reuse these nodes in our package to implement these functionalities. The following is the link to the differential_drive package in ROS:
http://wiki.ros.org/differential_drive
The following figure shows how these nodes communicate with each other; we will also discuss the use of the other nodes:

The purpose of each node in the chefbot_bringup package is as follows:
twist_to_motors.py: This node converts a ROS Twist command (linear and angular velocity) into individual motor velocity targets. The target velocities are published at a rate of ~rate Hz, and the node keeps publishing the velocity for timeout_ticks cycles after the Twist messages stop. The following are the Topics and parameters published and subscribed by this node:

Publishing Topics:
lwheel_vtarget (std_msgs/Float32): This is the target velocity of the left wheel (m/s).
rwheel_vtarget (std_msgs/Float32): This is the target velocity of the right wheel (m/s).

Subscribing Topics:
Twist (geometry_msgs/Twist): This is the target Twist command for the robot. The linear velocity in the x direction and the angular velocity theta of the Twist messages are used by this robot.

Important ROS parameters:
~base_width (float, default: 0.1): This is the distance between the robot's two wheels in meters.
~rate (int, default: 50): This is the rate at which the velocity target is published (Hz).
~timeout_ticks (int, default: 2): This is the number of velocity target messages published after the Twist messages stop.
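The conversion this node performs can be summarized using standard differential-drive kinematics. The following is a minimal sketch with our own function name (not the node's), using ~base_width as the wheel separation:

```python
def twist_to_wheel_targets(linear_x, angular_z, base_width=0.1):
    """Split a Twist (linear x, angular z) into left/right wheel
    target velocities for a differential-drive robot."""
    # Each wheel moves at the body velocity plus/minus the extra speed
    # needed to rotate the base about its center at angular_z rad/s.
    right = linear_x + angular_z * base_width / 2.0
    left = linear_x - angular_z * base_width / 2.0
    return left, right

# Pure forward motion: both wheels get the same target.
print(twist_to_wheel_targets(0.2, 0.0))   # (0.2, 0.2)
# Pure rotation: the wheels get equal and opposite targets.
print(twist_to_wheel_targets(0.0, 1.0))   # (-0.05, 0.05)
```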
pid_velocity.py: This is a simple PID controller that controls the speed of each motor by taking feedback from the wheel encoders. In a differential drive system, we need one PID controller for each wheel. It reads the encoder data from each wheel and controls the speed of each wheel.

Publishing Topics:
motor_cmd (Float32): This is the final output of the PID controller that goes to the motor. We can change the range of the PID output using the out_min and out_max ROS parameters.
wheel_vel (Float32): This is the current velocity of the robot wheel in m/s.

Subscribing Topics:
wheel (Int16): This Topic is the output of a rotary encoder; there is an individual Topic for each encoder of the robot.
wheel_vtarget (Float32): This is the target velocity in m/s.

Important parameters:
~Kp (float, default: 10): This parameter is the proportional gain of the PID controller.
~Ki (float, default: 10): This parameter is the integral gain of the PID controller.
~Kd (float, default: 0.001): This parameter is the derivative gain of the PID controller.
~out_min (float, default: -255): This is the minimum limit of the velocity value sent to the motor on the motor_cmd Topic.
~out_max (float, default: 255): This is the maximum limit of the velocity value sent to the motor on the motor_cmd Topic.
~rate (float, default: 20): This is the rate of publishing the wheel_vel Topic (Hz).
ticks_meter (float, default: 20): This is the number of wheel encoder ticks per meter. This is a global parameter because it's used in other nodes too.
vel_threshold (float, default: 0.001): If the robot velocity drops below this parameter, we consider the wheel as stopped and treat its velocity as zero.
encoder_min (int, default: -32768): This is the minimum value of the encoder reading.
encoder_max (int, default: 32768): This is the maximum value of the encoder reading.
wheel_low_wrap (int, default: 0.3 * (encoder_max - encoder_min) + encoder_min), wheel_high_wrap (int, default: 0.7 * (encoder_max - encoder_min) + encoder_min): These values decide whether the encoder count has wrapped around, and hence whether the odometry is moving in the negative or positive direction.
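A single update step of such a PID controller can be sketched as follows. This is a simplified illustration with our own class name, and without the encoder-wraparound and velocity-estimation logic that the real node also contains:

```python
class WheelPID:
    """Minimal PID step for one wheel, clamped to [out_min, out_max]."""

    def __init__(self, kp=10.0, ki=10.0, kd=0.001,
                 out_min=-255.0, out_max=255.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured, dt):
        # Error between the target wheel velocity and the velocity
        # estimated from the encoder feedback.
        error = target - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = (self.kp * error + self.ki * self.integral
               + self.kd * derivative)
        # Clamp to the motor command range (out_min/out_max).
        return max(self.out_min, min(self.out_max, out))

pid = WheelPID()
cmd = pid.update(target=1.0, measured=0.0, dt=0.1)
```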
diff_tf.py: This node computes the odometry and broadcasts the transform between the odometry frame and the robot base frame.

Publishing Topics:
odom (nav_msgs/Odometry): This publishes the odometry (the current pose and twist of the robot).
tf: This provides the transformation between the odometry frame and the robot base link.

Subscribing Topics:
lwheel (std_msgs/Int16), rwheel (std_msgs/Int16): These are the output values from the left and right encoders of the robot.
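The dead-reckoning computation behind this node can be sketched as follows, using the ticks_meter and wheel-separation parameters mentioned earlier (the function name is ours, not the node's):

```python
import math

def update_odometry(x, y, theta, d_left_ticks, d_right_ticks,
                    ticks_meter=20.0, base_width=0.1):
    """Advance the pose (x, y, theta) given encoder tick deltas."""
    # Convert tick deltas to distances traveled by each wheel.
    d_left = d_left_ticks / ticks_meter
    d_right = d_right_ticks / ticks_meter
    # Distance traveled by the robot center and change in heading.
    d = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / base_width
    # Integrate along the mid-heading of the step.
    x += d * math.cos(theta + d_theta / 2.0)
    y += d * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta

# Driving straight for 20 ticks on each wheel moves 1 m forward.
pose = update_odometry(0.0, 0.0, 0.0, 20, 20)
```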
chefbot_keyboard_teleop.py: This node sends Twist commands using controls from the keyboard.

Publishing Topics:
cmd_vel_mux/input/teleop (geometry_msgs/Twist): This publishes the Twist messages generated from keyboard commands.

After discussing the nodes in the chefbot_bringup package, we will look at the functions of the launch files.
Understanding ChefBot ROS launch files

We will discuss the functions of each launch file in the chefbot_bringup package.

robot_standalone.launch: The main function of this launch file is to start nodes such as launchpad_node, pid_velocity, diff_tf, and twist_to_motors to get sensor values from the robot and send command velocities to it.
keyboard_teleop.launch: This launch file starts teleoperation using the keyboard. It starts the chefbot_keyboard_teleop.py node to perform the keyboard teleoperation.
3dsensor.launch: This file launches the Kinect OpenNI drivers and starts publishing the RGB and depth streams. It also starts the depth-to-laser-scanner node, which converts the point cloud to laser scan data.
gmapping_demo.launch: This launch file starts the SLAM gmapping nodes to map the area surrounding the robot.
amcl_demo.launch: Using AMCL, the robot can localize itself and predict where it stands on the map. After localizing on the map, we can command the robot to move to a position on the map; the robot can then move autonomously from its current position to the goal position.
view_robot.launch: This launch file displays the robot URDF model in RViz.
view_navigation.launch: This launch file displays all the sensors necessary for the navigation of the robot.
Working with ChefBot Python nodes and launch files

We have already set up the ChefBot ROS packages on Intel's NUC PC and uploaded the embedded code to the Launchpad board. The next steps are to put the NUC PC on the robot, configure a remote connection from the laptop to the robot, test each node, and work with the ChefBot launch files to perform autonomous navigation.

The main device we need before working with ChefBot is a good wireless router. The robot and the remote laptop have to connect to the same network. If the robot PC and remote laptop are on the same network, the user can connect from the remote laptop to the robot PC through SSH using its IP address. Before putting the robot PC in the robot, we should connect it to the wireless network; once it's connected, it will remember the connection details, and when the robot powers up, the PC should automatically connect to the wireless network. Once the robot PC is connected to the wireless network, we can put it in the actual robot. The following figure shows the connection diagram of the robot and remote PC:

Wireless connection diagram of the robot and remote PC

The preceding figure assumes that the ChefBot IP is 192.168.1.106 and the remote PC IP is 192.168.1.101.

We can remotely access the ChefBot terminal using SSH. We can use a command of the following form to log in to ChefBot, where robot is the username of the ChefBot PC:

$ ssh robot@192.168.1.106

When you log in to the ChefBot PC, it will ask for the robot PC's password. After entering the password, we can access the robot PC terminal. After logging in, we can start testing the ROS nodes of ChefBot and check whether we can receive the serial values from the Launchpad board inside ChefBot. Note that you should log in to the ChefBot PC through SSH again if you are using a new terminal.
If the chefbot_bringup package is properly installed on the PC and the Launchpad board is connected, then before running the ROS driver node, we can run the miniterm.py tool to check whether the serial values come through properly to the PC via USB. We can find the serial device name using the dmesg command. We can run miniterm.py using the following command:

$ miniterm.py /dev/ttyACM0 115200

If it shows a permission denied message, set the permission of the USB device by writing udev rules, as we did in Chapter 5, Working with Robotic Actuators and Wheel Encoders, or temporarily change the permission using the following command. We are assuming that ttyACM0 is the device name of the Launchpad; if the device name is different on your PC, use that name instead of ttyACM0:

$ sudo chmod 777 /dev/ttyACM0

If everything works fine, we will get values like those shown in the following screenshot:
The letter b indicates the battery reading of the robot; it's not implemented yet, so the value is set to zero for now. The letter t indicates the total elapsed time in microseconds since the robot started running the embedded code. The second value is a time in seconds; it's the time taken to complete one entire operation in the Launchpad. We can use this value if we are performing real-time calculations of the robot's parameters; currently we are not using it, but we may in the future. The letter e indicates the values of the left and right encoders, respectively. Both values are zero here because the robot is not moving. The letter u indicates the values from the ultrasonic distance sensor; the distance value we get is in centimeters. The letter s indicates the current wheel speed of the robot. This value is for inspection purposes; the speed is actually a control output from the PC itself.
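Lines in this format (a letter code followed by numeric values) can be parsed in a few lines of Python. The following sketch assumes whitespace-separated fields, which may differ from the exact framing used by the firmware:

```python
def parse_serial_line(line):
    """Split a line such as 'e 0 0' or 'u 25.0' into its letter code
    and numeric values; return None for empty lines."""
    parts = line.strip().split()
    if not parts:
        return None
    key = parts[0]
    values = [float(v) for v in parts[1:]]
    return key, values

# 'e' carries the left and right encoder counts.
print(parse_serial_line("e 415 415"))   # ('e', [415.0, 415.0])
```

A dispatcher in the driver node can then route each key to the matching ROS publisher, for example sending the two values of an "e" line to the lwheel and rwheel Topics.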
To convert this serial data to ROS Topics, we have to run the driver node called launchpad_node.py. The following shows how to execute this node.

First, we have to run roscore before starting any nodes:

$ roscore

Run launchpad_node.py using the following command:

$ rosrun chefbot_bringup launchpad_node.py

If everything works fine, we will get the following output in the terminal running the node:

After running launchpad_node.py, we will see the Topics generated, as shown in the following screenshot:

We can view the serial data received by the driver node by subscribing to the /serial Topic; we can use it for debugging purposes. If the serial Topic shows the same data as we saw in miniterm.py, we can confirm that the nodes are working fine. The following screenshot shows the output of the /serial Topic:
After setting up the chefbot_bringup package, we can start working with the autonomous navigation of ChefBot. Currently, we are accessing only the ChefBot PC's terminal. To visualize the robot model, sensor data, maps, and so on, we have to use RViz on the user's PC. We have to do some configuration on the robot and user PCs to perform this operation. It should be noted that the user's PC should have the same software setup as the ChefBot PC.

The first thing we have to do is set the ChefBot PC as the ROS master by setting the ROS_MASTER_URI value. ROS_MASTER_URI is a required setting; it informs the nodes about the Uniform Resource Identifier (URI) of the ROS master. When we set the same ROS_MASTER_URI on the ChefBot PC and the remote PC, we can access the Topics of the ChefBot PC from the remote PC. So, if we run RViz locally, it will visualize the Topics generated on the ChefBot PC.

Assume that the ChefBot PC IP is 192.168.1.106 and the remote PC IP is 192.168.1.101. To set ROS_MASTER_URI on each system (typically a line of the form export ROS_MASTER_URI=http://192.168.1.106:11311 on both PCs, along with each machine's own IP in ROS_IP), the following settings should be included in the .bashrc file in the home folder. The following figure shows the setup needed in the .bashrc file on each system:
Add these lines at the bottom of .bashrc on each PC and change the IP addresses according to your network.

After we perform these settings, we can simply start roscore on the ChefBot PC terminal and execute the command rostopic list on the remote PC.

If you see any Topics, the setup is complete. We can first run the robot using keyboard teleoperation to check the robot's functioning and confirm that we get the sensor values.

We can start the robot driver and other nodes using the following command. Note that this should be executed in the ChefBot terminal after logging in via SSH:

$ roslaunch chefbot_bringup robot_standalone.launch

After launching the robot driver and nodes, start the keyboard teleoperation using the following command. This also has to be done in a new terminal of the ChefBot PC:

$ roslaunch chefbot_bringup keyboard_teleop.launch

To activate Kinect, execute the following command, also in the ChefBot terminal:

$ roslaunch chefbot_bringup 3dsensor.launch

To view the sensor data, we can execute the following command, which displays the robot model in RViz and should be executed on the remote PC. If we set up the chefbot_bringup package on the remote PC, we can run this command and visualize the robot model and sensor data from the ChefBot PC:

$ roslaunch chefbot_bringup view_robot.launch

The following screenshot shows the output of RViz. We can see the LaserScan and PointCloud displays in the screenshots:
ChefBot LaserScan data in RViz

The preceding screenshot shows LaserScan in RViz. We need to tick the LaserScan Topic in the left-hand panel of RViz to enable the laser scan data; the laser scan data is then marked on the viewport. If you want to watch the PointCloud data from Kinect, click on the Add button on the left-hand side of RViz and select PointCloud2 from the pop-up window.

Select the Topic /camera/depth_registered from the list, and you will see a view like the following screenshot:
ChefBot with PointCloud data

After working with the sensors, we can perform SLAM to map the room. The following procedure helps to start SLAM on this robot.

Working with SLAM on ROS to build the map of the room

To perform gmapping, we have to execute the following commands.

Start the robot driver in the ChefBot terminal:

$ roslaunch chefbot_bringup robot_standalone.launch

Execute the following command to start the gmapping process. Note that it should be executed in the ChefBot terminal:

$ roslaunch chefbot_bringup gmapping_demo.launch
Gmapping will only work if the odometry values received are proper. If odometry values are received from the robot, we will get the following message for the preceding command. If we get this message, we can confirm that gmapping will work fine:

To start the keyboard teleoperation, use the following command:

$ roslaunch chefbot_bringup keyboard_teleop.launch

To view the map being created, we need to start RViz on the remote system using the following command:

$ roslaunch chefbot_bringup view_navigation.launch

After viewing the robot in RViz, you can move the robot using the keyboard and see the map being created. When the entire area is mapped, we can save the map using the following command in the ChefBot PC terminal:

$ rosrun map_server map_saver -f ~/test_map

Here, test_map is the name of the map stored in the home folder. The following screenshot shows the map of a room created by the robot:
Mapping a room

After the map is stored, we can work with localization and autonomous navigation using ROS.

Working with ROS localization and navigation

After building the map, close all the applications and rerun the robot driver using the following command:

$ roslaunch chefbot_bringup robot_standalone.launch

Start localization and navigation on the stored map using the following command:

$ roslaunch chefbot_bringup amcl_demo.launch map_file:=~/test_map.yaml

Start viewing the robot using the following command on the remote PC:
$ roslaunch chefbot_bringup view_navigation.launch

In RViz, we may need to specify the initial pose of the robot using the 2D Pose Estimate button; we can change the robot's pose on the map using this button. If the robot is able to locate itself on the map, we can use the 2D Nav Goal button to command the robot to move to a desired position. When we start localization, we can see the particle cloud generated by the AMCL algorithm around the robot:

The following is a screenshot of the robot navigating autonomously from its current position to the goal position. The goal position is marked as a black dot:
The black line from the robot to the black dot is the robot's planned path to reach the goal position. If the robot is not able to locate itself on the map, we might need to fine-tune the parameter files in the chefbot_bringup param folder. For more fine-tuning details, you can go through the AMCL package on ROS at the following link:
http://wiki.ros.org/amcl?distro=indigo
Questions

1. What is the use of the robot's ROS driver node?
2. What is the role of the PID controller in navigation?
3. How do we convert the encoder data to odometry data?
4. What is the role of SLAM in robot navigation?
5. What is the role of AMCL in robot navigation?
Summary

This chapter was about assembling the hardware of ChefBot and integrating the embedded and ROS code into the robot to perform autonomous navigation. We saw the robot hardware parts that were manufactured using the design from Chapter 5, Working with Robotic Actuators and Wheel Encoders. We assembled the individual sections of the robot and connected the prototype PCB that we designed for the robot. This consists of the Launchpad board, motor driver, level shifter, ultrasonic sensor, and IMU. The Launchpad board was flashed with the new embedded code, which can interface with all the sensors in the robot and can send or receive data from the PC. After discussing the embedded code, we wrote the ROS Python driver node to handle the serial data from the Launchpad board. After interfacing the Launchpad board, we computed the odometry data and implemented differential drive control using nodes from the differential_drive package in the ROS repository. We interfaced the robot with the ROS navigation stack, which enables it to perform SLAM and AMCL for autonomous navigation. We also discussed SLAM and AMCL, created a map, and executed autonomous navigation on the robot.
Chapter 11. Designing a GUI for a Robot Using Qt and Python

In the last chapter, we discussed the integration of the robot's hardware components and software packages for performing autonomous navigation. After the integration, the next step is to build a GUI to control the robot. We are building a GUI that can act as a trigger for the underlying ROS commands. Instead of running all the commands in the terminal, the user can work with GUI buttons. The GUI we are going to design is for a typical hotel room with nine tables. The user can set each table's position on the map of the hotel room and command the robot to go to a particular table to deliver food. After delivering the food, the user can command the robot to go to its home position.

Some of the most popular GUI frameworks currently available are Qt (http://qt.digia.com) and GTK+ (http://www.gtk.org/). Qt and GTK+ are open source, cross-platform user interface toolkits and development platforms. These two software frameworks are widely used in Linux desktop environments, such as GNOME and KDE.

In this chapter, we will be using the Python binding of the Qt framework to implement the GUI, because the Python binding of Qt is more stable than other UI Python bindings. We will see how to develop a GUI from scratch and program it using Python. After discussing basic Python and Qt programming, we will discuss the ROS interfaces of Qt, which are already available in ROS. We will first look at what the Qt UI framework is and how to install it on our PC.
Installing Qt on Ubuntu 14.04.2 LTS

Qt is a cross-platform application framework that is widely used to develop application software with a GUI as well as command-line tools. Qt is available on almost all operating systems, such as Windows, Mac OS X, Android, and so on. The main programming language used for developing Qt applications is C++, but bindings are available for languages such as Python, Ruby, Java, and so on. Let's take a look at how to install the Qt SDK on Ubuntu 14.04.2. We will install Qt with the Advanced Packaging Tool (APT), which comes with the Ubuntu installation. To install the Qt SDK and its required dependencies from the Ubuntu package repository, we can simply use the following command. We can install Qt version 4 using the following command:

$ sudo apt-get install qt-sdk

This command will install the entire Qt SDK and the libraries required for our project. The packages available in the Ubuntu repositories may not be the latest versions. To get the latest version of Qt, we can download the online or offline installer of Qt for various OS platforms from the following link:

http://qt-project.org/downloads

After installing Qt on our system, we can see how to develop a GUI using Qt and interface it with Python.
Working with Python bindings of Qt

Let's see how we can interface Python with Qt. In general, there are two modules available in Python for connecting to the Qt user interface. The two most popular frameworks are:

PyQt
PySide

PyQt

PyQt is one of the most popular Python bindings for the cross-platform Qt framework. PyQt is developed and maintained by Riverbank Computing Limited. It provides bindings for Qt version 4 and Qt version 5, and comes under the GPL (version 2 or 3) along with a commercial license. PyQt is available for Qt versions 4 and 5, called PyQt4 and PyQt5, respectively. These two modules are compatible with Python versions 2 and 3. PyQt contains more than 620 classes that cover user interfaces, XML, network communication, the web, and so on.

PyQt is available for Windows, Linux, and Mac OS X. It is a prerequisite to install the Qt SDK and Python in order to install PyQt. The binaries for Windows and Mac OS X are available at the following link:

http://www.riverbankcomputing.com/software/pyqt/download

We will see how to install PyQt4 on Ubuntu 14.04.2 using Python 2.7.

Installing PyQt on Ubuntu 14.04.2 LTS

If you want to install PyQt on Ubuntu/Linux, use the following command. This command will install the PyQt library, its dependencies, and some Qt tools:

$ sudo apt-get install python-qt4 pyqt4-dev-tools
PySide

PySide is an open source software project that provides Python bindings for the Qt framework. The PySide project was initiated by Nokia, and offers a full set of Qt bindings for multiple platforms. The technique used in PySide to wrap the Qt library is different from PyQt's, but the APIs of both are similar. PySide is currently not supported on Qt 5. PySide is available for Windows, Linux, and Mac OS X. The following link will guide you through setting up PySide on Windows and Mac OS X:

http://qt-project.org/wiki/Category:LanguageBindings::PySide::Downloads

The prerequisites of PySide are the same as those of PyQt. Let's see how we can install PySide on Ubuntu 14.04.2 LTS.

Installing PySide on Ubuntu 14.04.2 LTS

The PySide package is available in the Ubuntu package repository. The following command will install the PySide module and Qt tools on Ubuntu:
$ sudo apt-get install python-pyside pyside-tools

Let's work with both modules and see the differences between them.
Working with PyQt and PySide

After installing the PyQt and PySide packages, we can see how to write a Hello World GUI using PyQt and PySide. The main difference between PyQt and PySide lies only in some commands; most of the steps are the same. Let's see how to make a Qt GUI and convert it into Python code.

Introducing Qt Designer

Qt Designer is the tool for designing and inserting controls into a Qt GUI. A Qt GUI is basically an XML file that contains the information of its components and controls. The first step in working with a GUI is designing it. The Qt Designer tool provides various options to make excellent GUIs.

Start Qt Designer by entering the command designer-qt4 in the terminal. The following image shows what you will see after running this command:

The preceding image shows the Qt Designer interface. Select the Widget option from the New Form window and click on the Create button. This will create an empty widget; we can drag various GUI controls from the left-hand side of Qt 4 Designer onto the empty widget. Qt widgets are the basic building blocks of a Qt GUI.

The following image shows a form with a PushButton dragged from the left-hand side window of Qt Designer:
The Hello World application that we are going to build will have a PushButton; when we click on the PushButton, a Hello World message will be printed on the terminal. Before building the Hello World application, we need to understand what Qt signals and slots are, because we have to use these features to build the application.

Qt signals and slots

In Qt, GUI events are handled using the signals and slots mechanism. A signal is emitted from the GUI when an event occurs. Qt widgets have many predefined signals, and users can add custom signals for GUI events. A slot is a function that is called in response to a particular signal. In this example, we use the clicked() signal of the PushButton and create a custom slot for this signal; we can write our own code in this custom function. Let's see how we can create a button, connect a signal to a slot, and convert the entire GUI to Python.
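Conceptually, the signal/slot mechanism is an observer pattern: connecting a slot registers a callable with the signal, and emitting the signal invokes every registered callable. The following plain-Python analogue (not actual Qt code) illustrates the idea:

```python
class Signal:
    """Minimal plain-Python stand-in for a Qt signal."""

    def __init__(self):
        self._slots = []

    def connect(self, slot):
        # A slot is any callable; Qt lets several slots share a signal.
        self._slots.append(slot)

    def emit(self, *args):
        for slot in self._slots:
            slot(*args)

clicked = Signal()
messages = []
# Connecting a custom slot, analogous to the message() slot we
# create for the PushButton's clicked() signal.
clicked.connect(lambda: messages.append("Hello World"))
clicked.emit()   # messages is now ["Hello World"]
```

In real Qt code, the connection and emission are handled by the framework's event loop; this sketch only shows the registration-and-dispatch idea behind it.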
Here are the steps involved in creating the Hello World GUI application:

1. Drag and create a PushButton from Qt Designer onto the empty Form.
2. Assign a slot to the button's clicked event, which emits a signal called clicked().
3. Save the designed UI file with the .ui extension.
4. Convert the UI file to Python.
5. Write the definition of the custom slot.
6. Print the Hello World message inside the defined slot/function.
We have already dragged a button from Qt Designer onto an empty Form. Press the F4 key to insert a slot on the button. When we press F4, the PushButton turns red, and we can drag a line from the button and place the ground symbol in the main window. This is shown in the following screenshot:

Select the clicked() signal from the left-hand side and click on the Edit.. button to create a new custom slot. When we click on the Edit.. button, another window will pop up where we can create a custom function. You can create a custom function by clicking on the + symbol.

We created a custom slot called message(), as shown in the following screenshot:
Click on the OK button, save the UI file as hello_world.ui, and quit Qt Designer. After saving the UI file, let's see how we can convert a Qt UI file into a Python file.

Converting a UI file into Python code

After designing the UI file, we can convert it into its equivalent Python code. The conversion is done using a pyuic compiler. We already installed this tool while installing PyQt/PySide. The following are the commands to convert a Qt UI file into a Python file.

We have to use different commands for PyQt and PySide. The following command converts the UI into its PyQt equivalent file:

$ pyuic4 -x hello_world.ui -o hello_world.py

pyuic4 is a UI compiler that converts a UI file into its equivalent Python code. The -x argument makes the generated code executable as a standalone script, and we mention the output filename after the -o argument.

There are not many changes for the PySide command; instead of pyuic4, PySide uses pyside-uic to convert UI files into Python files. The remaining arguments are the same:

$ pyside-uic -x hello_world.ui -o hello_world.py

The preceding command will generate equivalent Python code for the UI file. If we run this Python code, the UI designed in Qt Designer will pop up. The generated script will not have the definition of the custom function message(); we should add this custom function to the generated code. The following procedure will guide you through adding the custom function, so that when you click on the button, the custom function message() will be executed.
Adding a slot definition to PyQt code

The Python code generated from PyQt is given here. The code generated by pyuic4 and pyside-uic is the same, except for the imported module names; all other parts are identical, so the explanation of the PyQt-generated code also applies to the PySide code. The code generated from the preceding conversion is as follows. The code structure and parameters can change according to the UI file that you have designed:
from PyQt4 import QtCore, QtGui

try:
    _fromUtf8 = QtCore.QString.fromUtf8
except AttributeError:
    _fromUtf8 = lambda s: s

class Ui_Form(object):
    def setupUi(self, Form):
        Form.setObjectName(_fromUtf8("Form"))
        Form.resize(514, 355)
        self.pushButton = QtGui.QPushButton(Form)
        self.pushButton.setGeometry(QtCore.QRect(150, 80, 191, 61))
        self.pushButton.setObjectName(_fromUtf8("pushButton"))
        self.retranslateUi(Form)
        QtCore.QObject.connect(self.pushButton,
            QtCore.SIGNAL(_fromUtf8("clicked()")), Form.message)
        QtCore.QMetaObject.connectSlotsByName(Form)

    def retranslateUi(self, Form):
        Form.setWindowTitle(QtGui.QApplication.translate("Form", "Form", None,
            QtGui.QApplication.UnicodeUTF8))
        self.pushButton.setText(QtGui.QApplication.translate("Form", "Press", None,
            QtGui.QApplication.UnicodeUTF8))

if __name__ == "__main__":
    import sys
    app = QtGui.QApplication(sys.argv)
    Form = QtGui.QWidget()
    ui = Ui_Form()
    ui.setupUi(Form)
    Form.show()
    sys.exit(app.exec_())
The preceding code is the equivalent Python script of the Qt UI file that we designed in the Qt Designer application. Here is the step-by-step working of this code:

1. The code starts executing from if __name__ == "__main__":. The first thing in a PyQt program is to create a QApplication object. The QApplication class manages the GUI application's control flow and main settings. It contains the main event loop, where all events from the window system and other sources are processed and dispatched, and it also handles the initialization and finalization of the application. The QApplication class is inside the QtGui module. This code creates an object of QApplication called app.
2. The Form = QtGui.QWidget() line creates an object called Form from the QWidget class that is present inside the QtGui module. The QWidget class is the base class of all the user interface objects of Qt. It can receive mouse and keyboard events from the window system.
3. The ui = Ui_Form() line creates an object called ui from the Ui_Form() class defined in the code. The Ui_Form() object can accept the QWidget object that we created in the previous line, and it can add buttons, text, button controls, and other UI components to this QWidget object. The Ui_Form() class contains two functions: setupUi() and retranslateUi(). We pass the QWidget object to the setupUi() function. This function adds UI components such as buttons to the widget, assigns slots for signals, and so on. The retranslateUi() function translates the language of the UI into other languages if needed; for example, if we need a translation from English to Spanish, we can mention the corresponding Spanish words in this function.
4. The Form.show() line displays the final window with buttons and text.

The next thing is to create the slot function, which prints the Hello World message. The slot definition is created inside the Ui_Form() class. The following steps insert the slot called message() into the Ui_Form() class.
The message() function definition is as follows:

def message(self):
    print "Hello World"

This should be inserted as a function inside the Ui_Form() class. Also, change the following line in the setupUi() function inside the Ui_Form() class:

QtCore.QObject.connect(self.pushButton, QtCore.SIGNAL(_fromUtf8("clicked()")),
    Form.message)

The Form.message parameter should be replaced with self.message. The preceding line connects the PushButton signal clicked() to the self.message() slot that we already inserted in the Ui_Form() class.
Up and running of the Hello World GUI application

After replacing the Form.message parameter with self.message, we can execute the code, and the output will look like this:

When we click on the Press button, it prints the Hello World message. This is all about setting up a custom GUI with Python and Qt.

In the next section, we will see the actual GUI that we are designing for the robot.
Working with ChefBot's control GUI

After completing the Hello World application in PyQt, we can now discuss a GUI for controlling ChefBot. The main purpose of building a GUI is to create an easier way to control the robot; for example, if the robot is deployed in a hotel to serve food, the person who controls the robot need not know the complex commands to start and stop it. Building a GUI for ChefBot can reduce this complexity and make control easier for the user. We are planning to build the GUI using PyQt, ROS, and the Python interface. The ChefBot ROS package is available on GitHub at the following link:
https://github.com/qboticslabs/Chefbot_ROS_pkg.git
If you haven't cloned the code yet, you can do it now using the following command:

$ git clone https://github.com/qboticslabs/Chefbot_ROS_pkg.git

The GUI code, named robot_gui.py, is placed in the scripts folder inside the chefbot_bringup package.

The following screenshot shows the GUI that we have designed for ChefBot:

The GUI has the following features:

- It can monitor the robot battery status and robot status. The robot status indicates the working state of the robot; for example, if the robot encounters an error, it is indicated on this GUI.
- It can command the robot to move to a table position for delivering food. There is a spin box widget on the GUI to input the table position. Currently, we are planning this GUI for a nine-table room, but we can expand it to any number of tables according to requirements. After inputting the table number, we can command the robot to go to that table by clicking on the Go button; the robot will move to that position.
- If we want to return the robot to its initial position, we can click on the Home button. If we want to cancel the current robot movement, we click on Cancel to stop the robot.

The working of this GUI application is as follows:
When we deploy ChefBot in a hotel, the first procedure is to create a map of the room. After mapping the entire room properly, we have to save the map on the robot's PC. The robot does the mapping only once; after mapping, we can run the localization and navigation routines and command the robot to move to a position on the map. The ChefBot ROS package comes with a map and a simulation model of a hotel-like environment. We can run this simulation and localization now for testing the GUI, and in the next chapter, we will discuss how to control the hardware using the GUI. If you install the ChefBot ROS packages on your local system, you can simulate a hotel environment and test the GUI.

Start the ChefBot simulation in a hotel-like arrangement using the following command:

$ roslaunch chefbot_gazebo chefbot_hotel_world.launch

After starting the ChefBot simulation, we can run the localization and navigation routines using an already built map. The map is placed in the chefbot_bringup package; we can see a map folder inside this package. Here, we will use this map for performing this test. We can load the localization and navigation routines using the following command:

$ roslaunch chefbot_gazebo amcl_demo.launch map_file:=/home/lentin/catkin_ws/src/chefbot/chefbot_bringup/map/hotel1.yaml

The path of the map file can differ on another system, so use the path on your system instead of this one.

If the path mentioned is correct, the ROS navigation stack will start running. If we want to see the robot's position on the map or manually set the initial position of the robot, we can use RViz with the following command:

$ roslaunch chefbot_bringup view_navigation.launch

In RViz, we can command the robot to go to any map coordinate using the 2D Nav Goal button.

We can also command the robot to go to any map coordinate programmatically. The ROS navigation stack works using the ROS actionlib library. The ROS actionlib library is for performing preemptable tasks; it is similar to ROS services, with the advantage that we can cancel a request if we don't want it executed anymore.

In the GUI, we can command the robot to go to a map coordinate using the Python actionlib library. We can get the table positions on the map using the following technique.

After starting the simulator and AMCL nodes, launch the keyboard teleoperation and move the robot near each table. Use the following command to get the translation and rotation of the robot:

$ rosrun tf tf_echo /map /base_link
When we click on the Go button, that position is fed to the navigation stack; the robot plans its path and reaches its goal. We can even cancel the task at any time. So the ChefBot GUI acts as an actionlib client, which sends map coordinates to the actionlib server, that is, the navigation stack.

We can now run the robot GUI to control the robot using the following command:

$ rosrun chefbot_bringup robot_gui.py

We can select a table number and click on the Go button to move the robot to that table.

Assuming that you cloned the files and have the robot_gui.py file, we can discuss the main slots we added to the Ui_Form() class for the actionlib client and for getting the values of the battery level and robot status.
We need to import the following Python modules for this GUI application:

import rospy
import actionlib
from move_base_msgs.msg import *
import time
from PyQt4 import QtCore, QtGui

The additional modules we require are the ROS Python client rospy and the actionlib module to send values to the navigation stack. The move_base_msgs module contains the message definition of the goal that needs to be sent to the navigation stack.

The robot positions near each table are mentioned in a Python dictionary. The following code shows hardcoded values of the robot's position near each table:
table_position = dict()
table_position[0] = (-0.465, 0.37, 0.010, 0, 0, 0.998, 0.069)
table_position[1] = (0.599, 1.03, 0.010, 0, 0, 1.00, -0.020)
table_position[2] = (4.415, 0.645, 0.010, 0, 0, -0.034, 0.999)
table_position[3] = (7.409, 0.812, 0.010, 0, 0, -0.119, 0.993)
table_position[4] = (1.757, 4.377, 0.010, 0, 0, -0.040, 0.999)
table_position[5] = (1.757, 4.377, 0.010, 0, 0, -0.040, 0.999)
table_position[6] = (1.757, 4.377, 0.010, 0, 0, -0.040, 0.999)
table_position[7] = (1.757, 4.377, 0.010, 0, 0, -0.040, 0.999)
table_position[8] = (1.757, 4.377, 0.010, 0, 0, -0.040, 0.999)
table_position[9] = (1.757, 4.377, 0.010, 0, 0, -0.040, 0.999)

We can access the position of the robot near each table by accessing this dictionary.

Currently, we have inserted only a few distinct values for demonstration purposes; entries 4 through 9 reuse the same pose. You can add more values by finding the positions of the other tables.

We assign some variables to handle the table number, the position of the robot, and the actionlib client inside the Ui_Form() class:
# Handle table number from the spin box
self.table_no = 0
# Stores the current table position of the robot
self.current_table_position = 0
# Creating the actionlib client
self.client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
# Creating the goal message definition
self.goal = MoveBaseGoal()
# Start this function for updating the battery and robot status
self.update_values()

The following code shows the signal and slot assignments for the button and spin box widgets:
# Handle the spin box signal and assign it to the slot set_table_number()
QtCore.QObject.connect(self.spinBox,
    QtCore.SIGNAL(_fromUtf8("valueChanged(int)")), self.set_table_number)
# Handle the Home button signal and assign it to the slot Home()
QtCore.QObject.connect(self.pushButton_3,
    QtCore.SIGNAL(_fromUtf8("clicked()")), self.Home)
# Handle the Go button signal and assign it to the slot Go()
QtCore.QObject.connect(self.pushButton,
    QtCore.SIGNAL(_fromUtf8("clicked()")), self.Go)
# Handle the Cancel button signal and assign it to the slot Cancel()
QtCore.QObject.connect(self.pushButton_2,
    QtCore.SIGNAL(_fromUtf8("clicked()")), self.Cancel)

The following slot handles the spin box value from the UI and assigns a table number. It also converts the table number into the corresponding robot position:

def set_table_number(self):
    self.table_no = self.spinBox.value()
    self.current_table_position = table_position[self.table_no]

Here is the definition of the Go slot for the Go button. This function inserts the position of the selected table into a goal message, fills in the message header, and sends it to the navigation stack:
def Go(self):
    # Assigning the x, y, z pose and orientation to the target_pose message
    self.goal.target_pose.pose.position.x = float(self.current_table_position[0])
    self.goal.target_pose.pose.position.y = float(self.current_table_position[1])
    self.goal.target_pose.pose.position.z = float(self.current_table_position[2])
    self.goal.target_pose.pose.orientation.x = float(self.current_table_position[3])
    self.goal.target_pose.pose.orientation.y = float(self.current_table_position[4])
    self.goal.target_pose.pose.orientation.z = float(self.current_table_position[5])
    # Frame id
    self.goal.target_pose.header.frame_id = 'map'
    # Timestamp
    self.goal.target_pose.header.stamp = rospy.Time.now()
    # Sending the goal to the navigation stack
    self.client.send_goal(self.goal)

The following code is the Cancel() slot definition. This cancels all the robot paths that it was planning to execute at that time:

def Cancel(self):
    self.client.cancel_all_goals()

The following code is the definition of Home(). This sets the table position to zero and calls the Go() function. The table at position zero is the home position of the robot:
def Home(self):
    self.current_table_position = table_position[0]
    self.Go()

The following definitions are for the update_values() and add() functions. The update_values() method starts updating the battery level and robot status in a thread. The add() function retrieves the ROS parameters for the battery status and robot status, and sets them on the progress bar and label, respectively:

def update_values(self):
    self.thread = WorkThread()
    QtCore.QObject.connect(self.thread, QtCore.SIGNAL("update(QString)"),
        self.add)
    self.thread.start()

def add(self, text):
    battery_value = rospy.get_param("battery_value")
    robot_status = rospy.get_param("robot_status")
    self.progressBar.setProperty("value", battery_value)
    self.label_4.setText(_fromUtf8(robot_status))

The WorkThread() class used in the preceding function is given here. The WorkThread() class is inherited from QThread, which is provided by Qt for threading. The thread simply emits the signal update(QString) with a particular delay. In the preceding function update_values(), the update(QString) signal is connected to the self.add() slot; so whenever update(QString) is emitted from the thread, it calls the add() slot and updates the battery and status values:
class WorkThread(QtCore.QThread):
    def __init__(self):
        QtCore.QThread.__init__(self)

    def __del__(self):
        self.wait()

    def run(self):
        while True:
            time.sleep(0.3)  # artificial time delay
            self.emit(QtCore.SIGNAL('update(QString)'), "")
        return
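The poll-and-notify pattern behind WorkThread can be sketched without Qt using Python's standard threading module. This is an illustration only (the PollingWorker class and its callback are invented for this sketch); in the real PyQt application, QThread and signals must be used so that widget updates happen on the GUI thread.

```python
import threading
import time

# Plain-Python analogue of WorkThread: a background thread that periodically
# invokes a callback, the way the QThread emits update(QString) to the add() slot.
class PollingWorker(threading.Thread):
    def __init__(self, callback, interval=0.05, count=3):
        threading.Thread.__init__(self)
        self.callback = callback   # plays the role of the connected slot
        self.interval = interval   # delay between "emissions"
        self.count = count         # finite loop so the sketch terminates

    def run(self):
        for _ in range(self.count):
            time.sleep(self.interval)
            self.callback("")      # "emit" the update

updates = []
worker = PollingWorker(updates.append)
worker.start()
worker.join()
print(len(updates))  # → 3
```

The design point is the same in both versions: the worker never touches the UI directly; it only notifies, and the receiver (the add() slot) does the actual reading of parameters and widget updates.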
We have discussed how to make a GUI for ChefBot, but this GUI is only for the user who controls ChefBot. If someone wants to debug and inspect the robot data, we may have to go for other tools. ROS provides excellent debugging tools to visualize data from the robot.

The rqt tool is one of the most popular ROS tools. It is based on the Qt framework for GUI development in ROS. Let's discuss the rqt tool, its installation procedure, and how we can inspect the sensor data from the robot.

Installing and working with rqt in Ubuntu 14.04.2 LTS

rqt is a software framework in ROS that implements various GUI tools in the form of plugins. We can add plugins as dockable windows in rqt.

Installing rqt in Ubuntu 14.04.2 can be done using the following command. Before installing, ensure that you have the full installation of ROS Indigo:

$ sudo apt-get install ros-indigo-rqt

After installing the rqt packages, we can access the GUI implementation of rqt, called rqt_gui, in which we can dock rqt plugins in a single window.

Let's start using rqt_gui.

Run the roscore command before running rqt_gui:

$ roscore

Run the following command to start rqt_gui:

$ rosrun rqt_gui rqt_gui

We will get the following window if the commands work fine:
We can load and unload plugins at runtime. To analyze the ROS message log, we can load the Console plugin from Plugins | Logging | Console. In the following example, we load the Console plugin and run a talker node from rospy_tutorials, which sends a Hello World message to a topic called /chatter.

Run the following command to start the talker.py node:

$ rosrun rospy_tutorials talker.py

In the following screenshot, rqt_gui is loaded with two plugins, named Console and Topic Monitor. The Topic Monitor plugin can be loaded from Plugins | Topics | Topic Monitor. The Console plugin monitors the messages printed by each node and their severity, which is very useful for debugging purposes. In the following figure, the left section of rqt_gui is loaded with the Console plugin and the right side with Topic Monitor. Topic Monitor lists the available topics and monitors their values.

In the following figure, the Console plugin monitors the talker.py node's messages and their severity level, whereas Topic Monitor monitors the values inside the /chatter topic.

We can also visualize data such as images and plot graphs in rqt_gui. For robot navigation and inspection, there are plugins for embedding RViz in rqt_gui. The Navigation viewer plugin views the map from the /map topic. The visualization plugins are available in Plugins | Visualization.
Questions

1. What are the popular UI toolkits available on the Linux platform?
2. What are the differences between the PyQt and PySide Qt bindings?
3. How do you convert a Qt UI file into a Python script?
4. What are Qt signals and slots?
5. What is rqt and what are its main applications?
Summary

In this chapter, we discussed creating a GUI for ChefBot that can be used by an ordinary user who doesn't have any idea about the internal workings of a robot. We used the Python binding of Qt called PyQt to create this GUI. Before we went into the main GUI design, we built a Hello World application to get an easier understanding of PyQt. The UI design was done using the Qt Designer tool, and the UI file was converted into its equivalent Python script using the Python UI compiler. After designing the main GUI in Qt Designer, we converted the UI file into a Python script and inserted the necessary slots in the generated script. The ChefBot GUI can start the robot, select a table number, and command the robot to move to that position. The position of each table is acquired from the generated map; we hardcoded the positions in this Python script for testing. When a table is selected, we set a goal position on the map, and when we click on the Go button, the robot moves to the goal position. The user can cancel the operation at any time and command the robot to return to the home position. The GUI can also receive the real-time status of the robot and its battery. After discussing the robot GUI, we looked at the debugging GUI tool in ROS called rqt, and saw some plugins used for debugging the data from the robot. In the next chapter, we will see the complete testing and calibration of the robot.
Chapter 12. The Calibration and Testing of ChefBot

In this chapter, we will discuss the calibration and testing of ChefBot, which is necessary before deploying the robot in the workplace. Testing can be done using the GUI that we built in the previous chapter. Before the test run, we can calibrate the sensors and address issues in the ChefBot hardware and software. In the testing procedure, we build a map of a hotel-like arrangement and navigate on the map using ROS on ChefBot. We also look at ways to improve accuracy and upgrade the ChefBot prototype in the future.

First, we will look at the calibration of sensors such as the Kinect, the quadrature encoders, and the IMU to improve the accuracy of the robot.

The calibration of Xbox Kinect using ROS

Kinect calibration is required to improve the accuracy of the Kinect data. In this robot, the Kinect is used instead of a laser scanner. We can generate data equivalent to that provided by a laser scanner by converting the point cloud data, using a depth-image-to-laser-scan converter package in ROS. This converted data may not be as precise as an actual laser scanner, so the error from the converted laser scan can affect robot mapping, navigation, and localization. To reduce the errors to some extent, we can do a calibration prior to our application. The Kinect can work on factory settings without being calibrated, but each device has its own camera parameters, and these can change from device to device. Some of the camera parameters are focal length, format size, principal point, and lens distortion. When we perform camera calibration, we are able to adjust these values.

One of the calibrations used for the Kinect is intrinsic calibration. Some of the intrinsic parameters are the focal length and the distortion model. Using intrinsic calibration, we can correct these values for the intrinsic parameters of the IR (depth) and RGB cameras.

Let's see how to perform Kinect intrinsic calibration using ROS and Python.

Calibrating the Kinect RGB camera

Before calibrating the Kinect using ROS, ensure that the OpenNI driver packages and the camera calibration packages of ROS are installed. If they are not installed, we can install them using the following command:

$ sudo apt-get install ros-indigo-openni-launch ros-indigo-camera-calibration

Before the calibration, print an 8x6 checkerboard with squares 0.108 meters in size. A standard 8x6 checkerboard file is available at the following link:
http://wiki.ros.org/camera_calibration/Tutorials/MonocularCalibration
Follow this procedure to calibrate the RGB camera of the Kinect:

1. Start the OpenNI driver using the following command. This starts the Kinect RGB and depth stream images:

$ roslaunch openni_launch openni.launch

2. After launching the drivers, run the calibrator code available in the camera_calibration package. The cameracalibrator.py file is the node that performs camera calibration. We have to specify the RGB raw image topic, the camera topic, the size of the checkerboard, and the size of the squares that we are using. Run the calibrator node using the following command:

$ rosrun camera_calibration cameracalibrator.py image:=/camera/rgb/image_raw camera:=/camera/rgb --size 8x6 --square 0.108

3. The above command will open the calibration window.
4. Take the printed checkerboard, hold it in your hand, and show it to the Kinect RGB camera; you can see the pattern getting detected, as in the preceding figure. If it is detected properly, move the checkerboard to the left, right, top, and bottom of the camera view, as shown in the following figure. There are four bars on the right: the X and Y bars indicate the amount of data collected in the x and y directions, and the Size and Skew bars indicate the samples of images taken towards/away from the camera and tilted up/down, respectively.
5. At each step, hold the checkerboard still until the image gets highlighted by the detection pattern in the calibration window. The necessary checkerboard positions are shown as follows:
6. As we move the checkerboard around the RGB camera, the sizes of the bars increase, and when the calibration program has enough samples, the CALIBRATE button becomes active.
7. Click on the CALIBRATE button to start the calibration. The calibration process can take about a minute. The calibration window may be unresponsive for some time, but it will be ready eventually.
After the calibration process is complete, we can see the calibration results in the terminal, and the corrected image will be shown in the calibration window.

A successful calibration results in a rectified image, and a failed calibration usually results in a blank or unrecognizable image.

After calibration, we can use the slider on the top of the calibration window to adjust the size of the rectified image. At a scale of 0.0, the rectified image shows only valid pixels, and some pixels from the original image are discarded. A scale of 1.0 shows the whole original image, and the rectified image has black borders where there are no input pixels from the original image.

If you are satisfied with the calibration, click on the COMMIT button to send the calibration parameters to the camera for permanent storage. The GUI exits, and you should see "writing calibration data to ..." in the console.
Calibrating the Kinect IR camera

The Kinect can detect the depth of the environment using an IR camera and an IR speckle projector, which together perform the same function as a stereo camera. We can calibrate the depth image that we get from the Kinect using the checkerboard that we used for the RGB camera calibration.

The difficulty in calibrating a depth image is that the speckle pattern on the depth image can block the accurate detection of the corners of the checkerboard. One possible solution is to cover the IR projector and illuminate the board with another IR source, such as sunlight or an incandescent lamp. The first figure shows the checkerboard with the speckle pattern, and the second figure shows the depth image with the speckle IR projector covered and the board illuminated by an incandescent lamp.

The same tool used for the RGB camera can be used for depth calibration. The following command is used to calibrate the IR camera. We run the calibrator node with the depth image topic; we are using the 8x6 checkerboard with a square size of 0.108 meters, as shown in the following example:

$ rosrun camera_calibration cameracalibrator.py image:=/camera/ir/image_raw camera:=/camera/ir --size 8x6 --square 0.108

The ROS driver for the Kinect cannot stream both IR and RGB images; it decides which of the two to stream based on the number of subscribers. It is best not to run any ROS nodes that subscribe to RGB frames during the calibration of the depth image.

Repeat the same movements as in the RGB camera calibration for the depth camera too. After the calibration, we can press the COMMIT button to save the calibration values to a file. When we press the COMMIT button, the values are sent to the openni_camera driver package in the form of a ROS service call.
When openni_camera receives the camera parameters, it stores them in a file with the help of a ROS package called camera_info_manager. The camera_info_manager package can handle these parameters and store them in a given location. The default location of the intrinsic parameters is $HOME/.ros/camera_info/NAME.yaml, where the file name contains the camera name and the device serial number. This file can be moved to any public location we want. The file names of the RGB and depth calibration will look like rgb_A00362903124106A.yaml and depth_A00362903124106A.yaml.

The content of the RGB camera calibration file is given as follows:

image_width: 640
image_height: 480
camera_name: rgb_A00362903124106A
camera_matrix:
  rows: 3
  cols: 3
  data: [543.275251827696, 0, 286.5024846235134, 0, 544.9622371717294,
    270.5536535568697, 0, 0, 1]
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data: [0.1236660642773772, -0.2974236496563437, 0.008147821573873778,
    -0.03185623828978901, 0]
rectification_matrix:
  rows: 3
  cols: 3
  data: [1, 0, 0, 0, 1, 0, 0, 0, 1]
projection_matrix:
  rows: 3
  cols: 4
  data: [531.7443237304688, 0, 263.0477918357592, 0, 0, 559.802490234375,
    274.1133349321171, 0, 0, 0, 1, 0]

If the files are placed in the default location, the OpenNI driver can automatically take the calibration files from there. If we want to save them in some other location, we have to use the launch file section given in the following code and mention the paths of the camera calibration files as arguments of openni.launch:

<launch>
  <!-- Include the official launch file and specify camera_info URLs -->
  <include file="$(find openni_launch)/launch/openni.launch">
    <!-- provide arguments to that launch file -->
    <arg name="rgb_camera_info_url"
      value="file:///public/path/rgb_A00362903124106A.yaml" />
    <arg name="depth_camera_info_url"
      value="file:///public/path/depth_A00362903124106A.yaml" />
  </include>
</launch>
WheelodometrycalibrationCalibrationisrequiredinodometrytoreducenavigationalerrors.ThemainparameterneededtocalibratethisisthemeasureofDistanceperencoderticksofthewheels.Itisthedistancetraversedbytherobotwheelafterduringeachencodertick.
Thewheelbaseisthedistancebetweenthetwodifferentialdrivewheels.Distanceperencoderticksisthedistancetraversedbythewheeloneachencodercount.Wecancalibratetherobotbymonitoringencodercountsofeachwheelbydrivingforafixeddistance.Theaverageofthesecountsisdividedbythetotaldistancetraveledtogetastartingvaluefortheencoderclick,whichhappenspermillimeter.Theencodermanufacturermaymentionanencodercountinonerevolution,butinapracticalscenario,therewillbechangesinit.
Tocalibratetherobot,drivetherobotforafixeddistanceandnotedowntheencodercountsintheleftandrightmotor.Thefollowingequationcangiveanaveragecountpermillimeter:
Counts per millimeter = ((left counts + right counts) / 2) / total millimeters traveled
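The equation above can be captured in a small helper function; this is only an illustrative sketch, not code from the ChefBot packages:

```python
def counts_per_mm(left_counts, right_counts, distance_mm):
    # Average the two wheel counts, then divide by the distance driven
    return ((left_counts + right_counts) / 2.0) / distance_mm

# Example: 10000 and 10020 ticks recorded over a 1000 mm test run
ticks_per_mm = counts_per_mm(10000, 10020, 1000)  # 10.01 counts per mm
```

The reciprocal of this value gives the distance per encoder tick used by the odometry code.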
Error analysis of wheel odometry
An error in wheel odometry can result in the accumulation of errors in the robot's position. The odometry value can change when the wheel slips on the ground or moves on uneven terrain. The odometry error generated while the robot rotates can cause severe errors in the robot's final position. For example, in a 10 meter trip, if both wheels slip by 1 centimeter, this causes a 0.1 percent error in the total distance, and the robot will arrive 1 cm short of its destination.
However, if the slip between the two wheels is 1 centimeter, it can cause more errors than the first case that we discussed. This can result in a large error in both the X and Y coordinates.
Assume that the robot's wheelbase is 10 centimeters; a slip of 1 centimeter between the two wheels then results in a heading error of 0.1 radians, which is about 5.72 degrees. Given here is the equation to find the heading error:
heading error = (left - right) / wheelbase
              = 0.01 / 0.1
              = 0.1 radians * (180 / PI) = ~5.72 degrees
We can find the final position of the robot after 10 meters with a heading error of 0.1 radians, as shown here:
X' = 10 * sin(5.72) = ~1 meter
Y' = 10 * cos(5.72) = ~9.95 meters
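The arithmetic above can be checked with a short Python sketch (plain trigonometry, no ROS required; the function names are illustrative):

```python
import math

def heading_error(left_slip_m, right_slip_m, wheelbase_m):
    # Differential slip divided by the wheelbase gives the heading error in radians
    return (left_slip_m - right_slip_m) / wheelbase_m

def final_position(distance_m, theta_rad):
    # Displacement after travelling distance_m with a constant heading error
    x = distance_m * math.sin(theta_rad)  # lateral drift
    y = distance_m * math.cos(theta_rad)  # forward progress
    return x, y

theta = heading_error(0.01, 0.0, 0.1)  # 0.1 rad, about 5.73 degrees
x, y = final_position(10.0, theta)     # roughly (1.0, 9.95)
```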
From these calculations, we know that a heading error of 0.1 radians causes a shift of about 1 meter in the X direction from the destination position. The illustration of this error is given as follows:
From this analysis, we understand that a small error in θ produces large errors in X and Y. The main error affects the orientation of the robot rather than the distance traveled. There are some methods to reduce this error. These are mentioned in the following section.
Error correction
From the above analysis, it can be seen that the most important error in the robot's position calculation is the error in heading, the θ calculation. Some of the methods to reduce θ errors are as follows:
Digital compass: We can deploy a digital compass on the robot to get its heading and thereby reduce the heading error. A digital compass can be used alone, but it may encounter problems; for example, a local magnetic anomaly can introduce large noise into its readings. Also, the compass must be perfectly level with respect to the robot's surface; if the surface is uneven, there will be errors in the compass readings.
Digital gyroscope: The digital gyroscope provides the rate of change of the angle, or angular velocity. If we know the angular velocity, we can find the angle by integrating the values over a period of time. We can find the angular velocity of the robot using a gyro, but the gyro value accumulates error too. If the robot has to cover a great distance, the error computed from the gyro will increase, so a gyro can only be used alone when the robot covers a short distance. If the robot covers a great distance, we can use a combination of a gyroscope and a compass.
Gyro-corrected compass: In this method, we incorporate the gyro and compass into a single unit, so that one sensor can correct the other. Combining these sensor values using a Kalman filter can give better heading values for the robot.
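The gyro-corrected compass idea can be illustrated with a complementary filter, a simpler cousin of the Kalman filter mentioned above. This is only a sketch; the function name and the value of alpha are illustrative, not from the ChefBot code:

```python
def fuse_heading(prev_heading, gyro_rate, compass_heading, dt, alpha=0.98):
    # Integrate the gyro rate (smooth but drifts over time), then blend in
    # the compass reading (noisy but drift-free); alpha near 1 trusts the gyro
    gyro_estimate = prev_heading + gyro_rate * dt
    return alpha * gyro_estimate + (1.0 - alpha) * compass_heading

# One update step: gyro says 0.1 rad/s for 1 s, compass reads 0.05 rad
heading = fuse_heading(0.0, 0.1, 0.05, 1.0)  # 0.099 rad
```

Called once per sensor cycle, the compass term slowly pulls the integrated gyro estimate back toward the true heading, cancelling the drift.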
In the ChefBot prototype, we are using the gyro alone. In future upgrades, we will replace the gyro with the combination of a gyro and compass.
We are using an IMU called MPU-6050 by InvenSense to get the heading of the robot. The following section explains a simple calibration method to reduce the offset of the MPU-6050.
Calibrating the MPU6050
The MPU6050 can be calibrated using the Energia code that displays the raw values of the sensor, mentioned in Chapter 6, Working with Robotic Sensors. This example is already available in the Energia examples. You will get this code by navigating to File/Examples/MPU6050/Examples/MPU6050_raw. Load this sketch into Energia and follow the procedure given here:
1. Place the MPU6050 breakout board on a flat surface. We can use an inclinometer to check the inclination of the surface.
2. Modify the current program by setting the offsets to zero. We can set the offsets of the three gyroscope axes to 0 using the following functions:
   setXGyroOffset(0);
   setYGyroOffset(0);
   setZGyroOffset(0);
3. Upload the raw code to the Launchpad and open the Energia serial monitor to confirm that serial data is coming from the Launchpad. Leave the breakout board for 5 to 10 minutes to allow the temperature to stabilize, then read the gyro values and note them down.
4. Set the offset values to the readings noted and upload the code again to the Launchpad.
5. Repeat this procedure until we get a reading of 0 from each gyro axis.
6. After achieving this goal, we finally have the offsets, which can be used for future purposes.
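As a sketch of what steps 3 and 4 compute, the offset for each axis is roughly the negated mean of the raw readings taken at rest. This assumes the raw (x, y, z) gyro samples have already been collected from the serial monitor; the `gyro_offsets` helper below is hypothetical and not part of the Energia example:

```python
def gyro_offsets(samples):
    # Mean of raw (x, y, z) gyro readings captured with the board at rest;
    # negated so that applying the offset drives the readings toward zero
    n = float(len(samples))
    sx = sum(s[0] for s in samples)
    sy = sum(s[1] for s in samples)
    sz = sum(s[2] for s in samples)
    return (-sx / n, -sy / n, -sz / n)

# Two illustrative raw samples noted down from the serial monitor
offsets = gyro_offsets([(10, -4, 2), (14, -6, 2)])  # (-12.0, 5.0, -2.0)
```

In practice, the iteration in step 5 refines these values because the offset registers do not map one-to-one onto the raw readings.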
These are the necessary procedures to calibrate the MPU6050. After sensor calibration, we can test the robot hardware using the GUI.
Testing of the robot using GUI
We have already discussed how to build a map of the environment using the robot simulation and the robot hardware. In this section, we discuss how to command the robot to go to a particular place on the map. A good way to find the position of the robot near each table is to manually drive the robot using teleoperation.
Assuming that the ChefBot packages are configured on both the robot's PC and the user's PC, there should be a Wi-Fi network to which both the robot and user PCs can connect and communicate using the IP assigned to each PC. It should be noted that we have to set ROS_MASTER_URI and ROS_IP, as mentioned in Chapter 10, Integration of ChefBot Hardware and Interfacing it into ROS, Using Python.
The following procedure can be used to test the robot working in a hotel environment:
1. Remotely log in to the robot PC from the user PC using the ssh command. The command is given as follows:
$ ssh <robot_pc_ip_address>
2. If the room is not mapped yet, we can map it using the following commands in the robot terminal. Start the robot sensors and the odometry handling nodes using the following command:
$ roslaunch chefbot_bringup robot_standalone.launch
3. After starting the nodes, we can start the gmapping nodes using the following command:
$ roslaunch chefbot_bringup gmapping_demo.launch
4. After starting gmapping, we can start the keyboard teleoperation nodes to move the robot using the keyboard:
$ roslaunch chefbot_bringup keyboard_teleop.launch
5. After launching the teleoperation, run RViz on the user system to view the map being generated:
$ roslaunch chefbot_bringup view_navigation.launch
6. A typical map is given in the preceding figure. It can vary according to the environment. After generating the map, we have to run the following command to save the generated map in the home folder:
$ rosrun map_server map_saver -f ~/<name_of_the_map>
7. After saving the map, we can start the AMCL node for autonomous navigation:
$ roslaunch chefbot_bringup amcl_demo.launch map_file:=~/<map_name.yaml>
8. The robot on the map after starting AMCL is shown in the figure. After running AMCL, start the keyboard teleoperation and move the robot to each table, referring to the map.
9. Check whether the robot's position is the same on the map and in the actual environment. If there is a huge difference, then we need to remap the room. If the difference is small, we can retrieve and view the robot's position near each table with respect to the map, using the following command:
$ rosrun tf tf_echo /map /base_link
We will get the translation and rotation values of the robot at each position, as shown in the following screenshot:
Note the corresponding position of the robot near each table and feed it into the GUI code. The GUI code must be edited on the user PC. After inserting the position of the robot at each table into the GUI code, run the GUI node using the following command on the user system:
$ rosrun chefbot_bringup robot_gui.py
The following GUI will pop up, and we can control the robot using this GUI.
We can command the robot to go to a particular table by entering the table number in the GUI and pressing the Go button. When we press the Go button, the corresponding goal position is sent to the navigation stack. After testing the GUI, we can deploy the robot in the current environment.
We can now look at the advantages and disadvantages of this navigation method. The navigation method mainly depends on the ROS framework and programming using Python.
Pros and cons of the ROS navigation
The main highlight of ROS navigation is that the code is open and reusable. Also, it is simple to understand, even though the underlying technology is complex. People with minimal computer knowledge can program an autonomous robot.
The cons of this method are: it is not yet stable and is still in the testing stage, we can't expect high accuracy from it, and the code may not be of industrial standard.
Questions
1. What is the main use of the intrinsic calibration of the Kinect?
2. How can we calibrate the MPU6050 values?
3. How can we calibrate the wheel odometry parameters?
4. What are the possible errors in wheel odometry?
5. What are the pros and cons of this robot design?
Summary
In this chapter, we discussed all the possible errors and calibrations required before starting the robot. Calibration is required to reduce errors from the sensors. We also saw all the steps to perform before working with the robot GUI that we designed, as well as the pros and cons of this method. The area we are handling is still being researched, so we can't expect high accuracy from the current prototype.
Index

A

A* search algorithm
  about / Where do robots come from?
A.L.I.C.E
  about / Loading a single AIML file from the command-line argument
A.L.I.C.E. AIML files
  working with / Working with A.L.I.C.E. AIML files
  reference link / Working with A.L.I.C.E. AIML files
  loading, into memory / Loading AIML files into memory
  loading / Loading AIML files and saving them in brain files
  saving, in brain file / Loading AIML files and saving them in brain files
  loading, Bootstrap method used / Loading AIML and brain files using the Bootstrap method
Absolute Zero, LibreCAD / Creating a 2D CAD drawing of the robot using LibreCAD
acoustic model
  about / Block diagram of a speech recognition system
ActiveState Python 2.7 bit
  URL, for downloading / Installation of the Speech SDK
Adaptive Monte Carlo Localization (AMCL)
  URL / Creating the Gazebo model from TurtleBot packages
Advance Packaging Tool (APT)
  about / Installing Qt on Ubuntu 14.04.2 LTS
AIML
  about / Introduction to AIML
  working with / Working with AIML and Python
AIML tags
  about / Introduction to AIML tags
  <aiml> tag / Introduction to AIML tags
  <category> tag / Introduction to AIML tags
  <pattern> tag / Introduction to AIML tags
  <template> tag / Introduction to AIML tags
  <star index="n"/> tag / Introduction to AIML tags
  <srai> tag / Introduction to AIML tags
  reference link / Introduction to AIML tags
aiml_client.py file / aiml_client.py
aiml_server.py file / aiml_server.py
aiml_speech_recog_client.py file / aiml_speech_recog_client.py
aiml_tts_client.py file / aiml_tts_client.py
Alicebot free software community
  URL / Introduction to AIML
AMCL package
  URL / Working with ROS localization and navigation
apt-key
  about / Installing ROS Indigo on Ubuntu 14.04.2
Arduino
  URL / Embedded controller board
Asus Xtion PRO
  URL / Kinect
Asus Xtion Pro / List of robotic vision sensors and image processing libraries
AutoCAD
  URL / Robot chassis design
autonomous system / Modern definition of a robot

B

bags / The ROS Computation Graph
baudRate variable
  about / Working with Dynamixel actuators
BeagleBone
  URL / Central Processing Unit
binaries, for Windows and Mac OS X
  URL / PyQt
Blender
  URL / Robot chassis design
  about / Robot chassis design, Installing LibreCAD, Blender, and MeshLab
  installing / Installing Blender
  URL, for downloading / Installing Blender
  URL, for documentation / Installing Blender
  URL, for tutorials / Working with a 3D model of the robot using Blender
  URL, for Python scripting / Python scripting in Blender
Blender Python APIs
  about / Introduction to Blender Python APIs
Block, LibreCAD
  reference link / Creating a 2D CAD drawing of the robot using LibreCAD
block diagram, robot
  about / Block diagram of the robot
  motor / Motor and encoder
  encoder / Motor and encoder
  motor driver / Motor driver
  Embedded Controller board / Embedded controller board
  ultrasonic sensors / Ultrasonic sensors
  Inertial Measurement Unit (IMU) / Inertial Measurement Unit
  kinect / Kinect
  Central Processing Unit / Central Processing Unit
  Speakers/Mic / Speakers/mic
  Power Supply/Battery / Power supply/battery
block diagram, speech recognition system
  about / Block diagram of a speech recognition system
  feature extraction / Block diagram of a speech recognition system
  acoustic model / Block diagram of a speech recognition system
  lexicon / Block diagram of a speech recognition system
  language model / Block diagram of a speech recognition system
  search algorithm / Block diagram of a speech recognition system
  recognized words / Block diagram of a speech recognition system
Bootstrap method
  used, for loading brain files / Loading AIML and brain files using the Bootstrap method
  used, for loading A.L.I.C.E. AIML files / Loading AIML and brain files using the Bootstrap method
bpy module
  about / Introduction to Blender Python APIs
  Context Access / Introduction to Blender Python APIs
  Data Access / Introduction to Blender Python APIs
  Operators / Introduction to Blender Python APIs
brain file
  A.L.I.C.E. AIML files, saving in / Loading AIML files and saving them in brain files
  loading, Bootstrap method used / Loading AIML and brain files using the Bootstrap method
breakout board
  URL, for purchasing / Inertial Measurement Unit
Bullet
  URL / Introducing Gazebo

C

CAD tools
  SolidWorks / Robot chassis design
  AutoCAD / Robot chassis design
  Maya / Robot chassis design
  Google SketchUp / Robot chassis design
  Inventor / Robot chassis design
  Blender / Robot chassis design
  LibreCAD / Robot chassis design
calibration
  URL / Calibrating the Kinect RGB camera
Carmine / List of robotic vision sensors and image processing libraries
  URL, for purchasing / List of robotic vision sensors and image processing libraries
caster wheels
  reference link / Caster wheel design
catkin
  defining / Introducing catkin
  URL / Introducing catkin
Central Processing Unit / Central Processing Unit
Centre of Speech Technology Research (CSTR)
  about / Festival
ChefBot
  ROS Python driver, writing for / Writing a ROS Python driver for ChefBot
ChefBot description ROS package
  creating / Creating a ChefBot description ROS package
  chefbot_base_gazebo.urdf.xacro / chefbot_base_gazebo.urdf.xacro
  kinect.urdf.xacro / kinect.urdf.xacro
  chefbot_base.urdf.xacro / chefbot_base.urdf.xacro
ChefBot hardware
  specifications / Specifications of the ChefBot hardware
  working / Working of the ChefBot hardware
  building / Building ChefBot hardware
ChefBot PC
  configuring / Configuring ChefBot PC and setting ChefBot ROS packages
ChefBot Python nodes
  working with / Working with ChefBot Python nodes and launch files
ChefBot ROS launch files
  defining / Understanding ChefBot ROS launch files
ChefBot ROS package
  URL / Working with ChefBot's control GUI
ChefBot ROS packages
  setting / Configuring ChefBot PC and setting ChefBot ROS packages
ChefBot sensors
  interfacing, to Tiva C LaunchPad / Interfacing ChefBot sensors with Tiva C LaunchPad
ChefBot simulation, in hotel environment
  about / Simulating ChefBot and TurtleBot in a hotel environment
CloudSim framework
  URL / Introducing Gazebo
CMakeList.txt and package.xml file / What is a robot model, URDF, xacro, and robot state publisher?
cmudict
  about / CMU Sphinx/PocketSphinx
CMU Sphinx
  about / CMU Sphinx/PocketSphinx
  URL / CMU Sphinx/PocketSphinx
code
  interfacing / Interfacing code
code, of Tiva C LaunchPad
  interfacing / Interfacing code of Tiva C LaunchPad
command-line argument
  single AIML file, loading from / Loading a single AIML file from the command-line argument
Command Box, LibreCAD
  reference link / Creating a 2D CAD drawing of the robot using LibreCAD
communication system, ChefBot
  block diagram / Block diagram of the communication system in ChefBot
components, robot
  about / What can we find in a robot?
  physical body / The physical body
  sensors / Sensors
  effectors / Effectors
  controllers / Controllers
control GUI, ChefBot
  working with / Working with ChefBot's control GUI
  features / Working with ChefBot's control GUI
counts per revolution (CPR)
  about / Processing encoder data
CUDA, for GPU acceleration
  URL / What is OpenCV?
cv_bridge
  used, for displaying Kinect images / Displaying Kinect images using Python, ROS, and cv_bridge

D

2D CAD drawing, of robot
  creating, LibreCAD used / Creating a 2D CAD drawing of the robot using LibreCAD
  base plate design / The base plate design
  base plate pole design / Base plate pole design
  Wheel, Motor, and Motor Clamp Design / Wheel, motor, and motor clamp design
  Caster Wheel Design / Caster wheel design
  Middle Plate Design / Middle plate design
  Top Plate Design / Top plate design
3D model of robot, with Blender
  about / Working with a 3D model of the robot using Blender
  Python scripting / Python scripting in Blender
DART
  URL / Introducing Gazebo
DC geared motor
  interfacing, to Tiva C LaunchPad / Interfacing DC geared motor with Tiva C LaunchPad
dead reckoning
  about / Inertial Navigation
degrees of freedom (DOF)
  about / Introduction to the differential steering system and robot kinematics
differential drive mechanism
  differential wheeled robot / Differential wheeled robot
  Energia IDE, installing / Installing the Energia IDE
  code, interfacing / Interfacing code
differential drive robot
  about / Robot drive mechanism
differential wheeled robot / Differential wheeled robot
differential_drive package
  URL / Writing a ROS Python driver for ChefBot
Digital Motion Processor (DMP)
  about / Embedded code for ChefBot
Direct Current (DC) / Block diagram of the robot
Distance per encoder ticks of the wheels
  about / Wheel odometry calibration
Dynamixel
  about / Working with Dynamixel actuators
Dynamixel actuators
  working with / Working with Dynamixel actuators
Dynamixel servos
  URL / Working with Dynamixel actuators

E

Echo pin
  about / Interfacing HC-SR04 to Tiva C LaunchPad
effectors, robot
  locomotion / Effectors
  manipulation / Effectors
embedded code, for ChefBot
  about / Embedded code for ChefBot
Embedded Controller board
  about / Embedded controller board
encoder
  about / Motor and encoder
  selecting, for robot / Selecting motors, encoders, and wheels for the robot
encoder data
  processing / Processing encoder data
Energia
  URL / Embedded controller board, Differential wheeled robot
  about / Differential wheeled robot
  downloading / Installing the Energia IDE
Energia IDE
  installing / Installing the Energia IDE
eSpeak
  about / eSpeak
  URL / eSpeak
  setting, in Ubuntu 14.04.2 / Setting up eSpeak and Festival in Ubuntu 14.04.2
Extensible Mark-up Language (XML)
  about / Introduction to AIML

F

feature extraction
  about / Block diagram of a speech recognition system
features, Gazebo
  dynamic simulation / Introducing Gazebo
  advanced 3D Graphics / Introducing Gazebo
  sensors support / Introducing Gazebo
  plugins / Introducing Gazebo
  TCP/IP Transport / Introducing Gazebo
  Cloud Simulation / Introducing Gazebo
  command-line tools / Introducing Gazebo
Festival
  about / Festival
  URL / Festival
  setting, in Ubuntu 14.04.2 / Setting up eSpeak and Festival in Ubuntu 14.04.2
files
  launching / Working with ChefBot Python nodes and launch files
forward kinematics equation / Explaining of the forward kinematics equation

G

Gazebo
  about / Understanding robotic simulation
  URL / Understanding robotic simulation
  defining / Introducing Gazebo
  features / Introducing Gazebo
  installing / Installing Gazebo
  testing, with ROS interface / Testing Gazebo with the ROS interface
Gazebo model
  creating, from TurtleBot packages / Creating the Gazebo model from TurtleBot packages
Gazebo simulator
  URL / Installing Gazebo
General-Purpose Input/Output (GPIO) / Embedded controller board
gmapping package
  URL / Working with SLAM using ROS and Kinect
Google SketchUp
  URL / Robot chassis design
gray codes
  URL / Processing encoder data
GTK+
  about / Installing TurtleBot Robot packages on ROS Indigo
GUI
  used, for testing robot / Testing of the robot using GUI

H

H-bridge
  about / Motor driver
HC-SR04
  about / Selecting the ultrasonic sensor
  interfacing, to Tiva C LaunchPad / Interfacing HC-SR04 to Tiva C LaunchPad
  working / Working of HC-SR04
Hello World GUI application
  creating / Qt signals and slots
  running / Up and running of Hello World GUI application
Hello_world_publisher.py / Hello_world_publisher.py
Hello_world_subscriber.py / Hello_world_subscriber.py
hierarchical (deliberative) control / Hierarchical (deliberative) control
HMM (Hidden Markov Models) / Block diagram of a speech recognition system
hybrid control / Hybrid control

I

image
  displaying, with Python-OpenCV interface / Reading and displaying an image using the Python-OpenCV interface
  capturing, from web camera / Capturing from web camera
image processing libraries
  about / List of robotic vision sensors and image processing libraries
IMU
  working with / Working with Inertial Measurement Unit
  inertial navigation system / Inertial Navigation
  MPU6050, interfacing with Tiva C LaunchPad / Interfacing MPU6050 with Tiva C LaunchPad
  code of Energia, interfacing / Interfacing code of Energia, Interfacing MPU6050 to Launchpad with the DMP support using Energia
in-built encoder
  about / Interfacing quadrature encoder with Tiva C Launchpad
Inertial Measurement Unit (IMU)
  about / chefbot_base_gazebo.urdf.xacro, Inertial Measurement Unit
  URL / chefbot_base_gazebo.urdf.xacro
inertial navigation system / Inertial Navigation
Inertial Navigation System (INS) / Inertial Navigation
input pins, motor driver / Input pins
Instantaneous Center of Curvature (ICC)
  about / Explaining of the forward kinematics equation
Intel DN2820FYKH / Central Processing Unit
Interrupt Service Routine (ISR) / Quadrature encoder interfacing code
Inventor
  URL / Robot chassis design
Inverse Kinematics / Inverse kinematics
IR proximity sensor
  working with / Working with the IR proximity sensor
Itseez
  URL / What is OpenCV?
J

Julius
  about / Julius
  URL / Julius
  speech recognition accuracy, improving in / Improving speech recognition accuracy in PocketSphinx and Julius
Julius speech recognizer
  installing / Installation of Julius speech recognizer and Python module

K

Kalman filter
  URL / Inertial Navigation
kinect
  about / Kinect
  URL, for purchasing / Kinect
Kinect
  about / List of robotic vision sensors and image processing libraries
  programming, with Python / Programming Kinect with Python using ROS, OpenCV, and OpenNI
Kinect images
  displaying, Python used / Displaying Kinect images using Python, ROS, and cv_bridge
  displaying, ROS used / Displaying Kinect images using Python, ROS, and cv_bridge
  displaying, cv_bridge used / Displaying Kinect images using Python, ROS, and cv_bridge
Kinect IR camera
  calibrating / Calibrating the Kinect IR camera
Kinect RGB camera
  calibrating / Calibrating the Kinect RGB camera
kinematics equations
  URL / Inverse kinematics
Kobuki
  about / Installing TurtleBot Robot packages on ROS Indigo

L

L-bracket, motor
  reference link / Selecting motors, encoders, and wheels for the robot
language model
  about / Block diagram of a speech recognition system
laser scan data
  Point Cloud, converting to / Conversion of Point Cloud to laser scan data
launch folder
  launch files / Creating the Gazebo model from TurtleBot packages
Launchpad
  functionalities / Embedded controller board
laws, robotics
  first law / Where do robots come from?
  second law / Where do robots come from?
  third law / Where do robots come from?
  zeroth law / Where do robots come from?
Layer List, LibreCAD
  reference link / Creating a 2D CAD drawing of the robot using LibreCAD
level convertor
  URL / Interfacing DC geared motor with Tiva C LaunchPad
level shifter
  about / Working of the ChefBot hardware
  URL / Interfacing ChefBot sensors with Tiva C LaunchPad
lexicon
  about / Block diagram of a speech recognition system
LibreCAD
  URL / Robot chassis design
  about / Robot chassis design, Installing LibreCAD, Blender, and MeshLab
  references, for installing / Installing LibreCAD, Blender, and MeshLab
  URL, for documentation / Installing LibreCAD, Blender, and MeshLab
  installing / Installing LibreCAD
LibreCAD tools
  reference link / Creating a 2D CAD drawing of the robot using LibreCAD
  Command Box / Creating a 2D CAD drawing of the robot using LibreCAD
  Layer List / Creating a 2D CAD drawing of the robot using LibreCAD
  Block / Creating a 2D CAD drawing of the robot using LibreCAD
  Absolute Zero / Creating a 2D CAD drawing of the robot using LibreCAD
lines per revolution (LPR)
  about / Processing encoder data
LM (Language Model) / Working with PocketSphinx Python binding in Ubuntu 14.04.2
loop() function / Quadrature encoder interfacing code

M

Mac OS X
  URL / Installing the Energia IDE
mathematical modeling, of robot
  about / Mathematical modeling of the robot
  steering system / Introduction to the differential steering system and robot kinematics
  robot kinematics / Introduction to the differential steering system and robot kinematics
  forward kinematics equation / Explaining of the forward kinematics equation
  Inverse Kinematics / Inverse kinematics
Maya
  URL / Robot chassis design
memory
  A.L.I.C.E. AIML files, loading into / Loading AIML files into memory
MeshLab
  about / Robot chassis design, Installing LibreCAD, Blender, and MeshLab
  installing / Installing MeshLab
  URL, for source code / Installing MeshLab
  URL / chefbot_base.urdf.xacro
methods, for reducing errors
  digital compass / Error correction
  digital gyroscope / Error correction
  gyro-corrected compass / Error correction
motor
  about / Motor and encoder
  selecting, for robot / Selecting motors, encoders, and wheels for the robot
motor driver
  about / Motor driver
  selecting / Selecting a motor driver/controller
  reference link / Selecting a motor driver/controller
  input pins / Input pins
  output pins / Output pins
  power supply pins / Power supply pins
motor rotation feedback
  reference link / Motor and encoder
motors, Robot Drive Mechanism
  selecting / Selection of motors and wheels
  RPM of motors, calculating / Calculation of RPM of motors
  Motor Torque, calculating / Calculation of motor torque
mounting hub
  reference link / Selecting motors, encoders, and wheels for the robot
MPU6050
  about / Inertial Navigation
  interfacing, with Tiva C LaunchPad / Interfacing MPU6050 with Tiva C LaunchPad
  calibrating / Calibrating the MPU6050
MPU6050 library
  setting, in Energia / Setting up the MPU6050 library in Energia

N

Natural interaction (NI) / What is OpenNI
Next Unit of Computing (NUC) / Central Processing Unit
nServos variable
  about / Working with Dynamixel actuators
NUC
  URL, for purchasing / Central Processing Unit
NumPy
  about / Reading and displaying an image using the Python-OpenCV interface
  URL / Reading and displaying an image using the Python-OpenCV interface
NXP
  URL / List of robotic vision sensors and image processing libraries
O

odometry values
  about / Explaining of the forward kinematics equation
OGRE framework
  URL / Introducing Gazebo
OpenCV
  about / What is OpenCV?
  applications / What is OpenCV?
  installing / Installation of OpenCV from source code in Ubuntu 14.04.2
  URL / Installation of OpenCV from source code in Ubuntu 14.04.2
OpenCV, in Mac OS X
  URL / What is OpenCV?
OpenCV, in Windows
  URL / What is OpenCV?
OpenCV-Python tutorials
  URL / Capturing from web camera
OpenCV support
  ROS package, creating with / Creating ROS package with OpenCV support
Open Dynamics Engine (ODE)
  URL / Introducing Gazebo
OpenNI
  about / What is OpenNI
  installing, in Ubuntu 14.04.2 / Installing OpenNI in Ubuntu 14.04.2
OpenNI driver
  launching / How to launch OpenNI driver
OpenSlam
  URL / Working with SLAM using ROS and Kinect
Open Source Computer Vision (OpenCV)
  about / Understanding robotic simulation
output pins, motor driver / Output pins

P

PCL
  about / What is PCL?
  URL, for downloading / What is PCL?
PCM (Pulse Code Modulation) / Block diagram of a speech recognition system
Personal Computer (PC)
  about / Robot chassis design
physical world / Modern definition of a robot
pinout, Texas Instrument Launchpad series
  reference link / Embedded controller board
pins, motor drivers
  about / Interfacing DC geared motor with Tiva C LaunchPad
pitch
about/IntroductiontothedifferentialsteeringsystemandrobotkinematicsPixy/CMUcam5
URL/ListofroboticvisionsensorsandimageprocessinglibrariesPocketSphinx
about/CMUSphinx/PocketSphinxsetting,inUbuntu14.04.2/SettingupPocketSphinxanditsPythonbindinginUbuntu14.04.2speechrecognitionaccuracy,improvingin/ImprovingspeechrecognitionaccuracyinPocketSphinxandJulius
pocketsphinx package
  installing, in ROS Indigo / Installation of the pocketsphinx package in ROS Indigo
PocketSphinx Python binding, Ubuntu 14.04.2
  about / Working with PocketSphinx Python binding in Ubuntu 14.04.2
pocketsphinx ROS package
  reference link / Working with Speech recognition in ROS Indigo and Python
Point Cloud
  working with / Working with Point Clouds using Kinect, ROS, OpenNI, and PCL
  generation / Opening device and Point Cloud generation
  converting, to laser scan data / Conversion of Point Cloud to laser scan data
Pololu
  URL / Selecting motors, encoders, and wheels for the robot
Pololu H-Bridge
  using / Interfacing DC geared motor with Tiva C LaunchPad
portName variable
  about / Working with Dynamixel actuators
Power Supply/Battery / Power supply/battery
power supply pins, motor driver / Power supply pins
Printed Circuit Board (PCB)
  about / Building ChefBot hardware
pulses per revolution (PPR)
  about / Processing encoder data
PyAIML
  about / Introduction to PyAIML
  installing, in Ubuntu 14.04.2 / Installing PyAIML on Ubuntu 14.04.2
  installing, from source code / Installing PyAIML from source code
PyAIML, integrating into ROS
  about / Integrating PyAIML into ROS
  aiml_server.py file / aiml_server.py
  aiml_client.py file / aiml_client.py
  aiml_tts_client.py file / aiml_tts_client.py
  aiml_speech_recog_client.py file / aiml_speech_recog_client.py
  start_chat.launch file / start_chat.launch
  start_tts_chat.launch file / start_tts_chat.launch
  start_speech_chat.launch file / start_speech_chat.launch
pydynamixel
  about / Working with Dynamixel actuators
  URL / Working with Dynamixel actuators
pyjulius
  URL, for downloading / Installation of Julius speech recognizer and Python module
PyQt
  about / PyQt
  URL / PyQt
  installing, in Ubuntu 14.04.2 LTS / Installing PyQt on Ubuntu 14.04.2 LTS
  working with / Working with PyQt and PySide
PyQt code
  slot definition, adding to / Adding a slot definition to PyQt code
PySerial module
  about / Interfacing Tiva C LaunchPad with Python
  URL / Interfacing Tiva C LaunchPad with Python
PySide
  about / PySide
  URL / PySide
  installing, on Ubuntu 14.04.2 LTS / Installing PySide on Ubuntu 14.04.2 LTS
  working with / Working with PyQt and PySide
Python
  Tiva C LaunchPad, interfacing with / Interfacing Tiva C LaunchPad with Python
  Kinect, programming with / Programming Kinect with Python using ROS, OpenCV, and OpenNI
  used, for displaying Kinect images / Displaying Kinect images using Python, ROS, and cv_bridge
  working with / Working with AIML and Python
Python-Julius client code, speech recognition / Python-Julius client code
Python-OpenCV interface
  image, reading with / Reading and displaying an image using the Python-OpenCV interface
  image, displaying with / Reading and displaying an image using the Python-OpenCV interface
Python APIs, Blender
  about / Introduction to Blender Python APIs
Python binding
  setting, in Ubuntu 14.04.2 / Setting up PocketSphinx and its Python binding in Ubuntu 14.04.2
Python bindings, of Qt
  working with / Working with Python bindings of Qt
  PyQt / PyQt
  PySide / PySide
Python code
  UI file, converting into / Converting a UI file into Python code
Python module
  installing / Installation of Julius speech recognizer and Python module
Python script, robot model / Python script of the robot model
Python wrapper, for Windows Speech SDK
  URL, for downloading / Installation of the Speech SDK
pyuic4
  about / Converting a UI file into Python code

Q

Qt
  installing, on Ubuntu 14.04.2 LTS / Installing Qt on Ubuntu 14.04.2 LTS
  about / Installing Qt on Ubuntu 14.04.2 LTS
  URL / Installing Qt on Ubuntu 14.04.2 LTS
  signals / Qt signals and slots
  slots / Qt signals and slots
Qt Designer
  defining / Introducing Qt Designer
quadrature encoder
  interfacing, with Tiva C LaunchPad / Interfacing quadrature encoder with Tiva C Launchpad
quadrature encoder interfacing code
  about / Quadrature encoder interfacing code

R

Raspberry Pi
  URL / Central Processing Unit
reactive control / Reactive control
real-time speech recognition, GStreamer / Real-time speech recognition using PocketSphinx, GStreamer, and Python in Ubuntu 14.04.2
real-time speech recognition, PocketSphinx / Real-time speech recognition using PocketSphinx, GStreamer, and Python in Ubuntu 14.04.2
real-time speech recognition, Python / Real-time speech recognition using PocketSphinx, GStreamer, and Python in Ubuntu 14.04.2
recognition grammar, Julius
  reference link / Improving speech recognition accuracy in PocketSphinx and Julius
recognized words
  about / Block diagram of a speech recognition system
RoboLogix
  about / Understanding robotic simulation
robot
  about / What is a robot?, Where do robots come from?, What can we find in a robot?
  history / History of the term robot, Where do robots come from?
  defining / Modern definition of a robot
  building / How do we build a robot?
  reactive control / Reactive control
  hierarchical (deliberative) control / Hierarchical (deliberative) control
  hybrid control / Hybrid control
  testing, GUI used / Testing of the robot using GUI
Robot chassis design
  about / Robot chassis design
Robot Drive Mechanism
  about / Robot drive mechanism
  motors, selecting / Selection of motors and wheels
  wheels, selecting / Selection of motors and wheels
  design summary / The design summary
robot dynamics
  about / Introduction to the differential steering system and robot kinematics
robotics
  about / Where do robots come from?
robotic simulation
  about / Understanding robotic simulation
  advantages / Understanding robotic simulation
  disadvantages / Understanding robotic simulation
  mathematical modeling, of robot / Mathematical modeling of the robot
  ROS / Introduction to ROS and Gazebo
  Gazebo / Introduction to ROS and Gazebo
  ROS Indigo, installing on Ubuntu 14.04.2 / Installing ROS Indigo on Ubuntu 14.04.2
  ChefBot simulation, in hotel environment / Simulating ChefBot and TurtleBot in a hotel environment
robotic simulator applications
  Gazebo / Understanding robotic simulation
  V-REP / Understanding robotic simulation
  webots / Understanding robotic simulation
  RoboLogix / Understanding robotic simulation
robotic vision sensors
  about / List of robotic vision sensors and image processing libraries
robot kinematics / Introduction to the differential steering system and robot kinematics
robot model / What is a robot model, URDF, xacro, and robot state publisher?
RobotShop
  URL / Calculation of RPM of motors
robot state publisher / What is a robot model, URDF, xacro, and robot state publisher?
robot_state_publisher
  about / What is a robot model, URDF, xacro, and robot state publisher?
roll
  about / Introduction to the differential steering system and robot kinematics
Roomba
  about / Robot chassis design
ROS
  about / Introduction to ROS and Gazebo
  defining / Introduction to ROS and Gazebo
  URL / Introduction to ROS and Gazebo
  features / Introduction to ROS and Gazebo
  filesystem / ROS Concepts
  Computation Graph / ROS Concepts
  community / ROS Concepts
  used, for displaying Kinect images / Displaying Kinect images using Python, ROS, and cv_bridge
  used, for calibrating Xbox Kinect / The Calibration of Xbox Kinect using ROS
ROS-PCL package
  URL, for downloading / What is PCL?
ros-users mailing list
  about / The ROS community level
ROS community level
  about / The ROS community level
  Distributions / The ROS community level
  Repositories / The ROS community level
  ROS Wiki / The ROS community level
  Mailing Lists / The ROS community level
ROS Computation Graph
  about / The ROS Computation Graph
  nodes / The ROS Computation Graph
  ROS Master / The ROS Computation Graph
  Parameter server / The ROS Computation Graph
  messages / The ROS Computation Graph
  topics / The ROS Computation Graph
  services / The ROS Computation Graph
  bags / The ROS Computation Graph
rosdep
  about / Installing ROS Indigo on Ubuntu 14.04.2
ROS filesystem
  about / The ROS filesystem
  packages / The ROS filesystem
  Package Manifests / The ROS filesystem
  message (msg) types / The ROS filesystem
  service (srv) types / The ROS filesystem
ROS Indigo
  installing, on Ubuntu 14.04.2 / Installing ROS Indigo on Ubuntu 14.04.2
  catkin, defining / Introducing catkin
  ROS package, creating / Creating an ROS package
  Hello_world_publisher.py / Hello_world_publisher.py
  Hello_world_subscriber.py / Hello_world_subscriber.py
  Gazebo, defining / Introducing Gazebo
  Gazebo, installing / Installing Gazebo
  Gazebo, testing with ROS interface / Testing Gazebo with the ROS interface
  TurtleBot Robot packages, installing / Installing TurtleBot Robot packages on ROS Indigo
  TurtleBot Robot packages, installing in Ubuntu / Installing TurtleBot ROS packages using the apt package manager in Ubuntu
  TurtleBot, simulating / Simulating TurtleBot using Gazebo and ROS
  Gazebo model, creating from TurtleBot packages / Creating the Gazebo model from TurtleBot packages
  robot model / What is a robot model, URDF, xacro, and robot state publisher?
  URDF / What is a robot model, URDF, xacro, and robot state publisher?
  xacro / What is a robot model, URDF, xacro, and robot state publisher?
  robot state publisher / What is a robot model, URDF, xacro, and robot state publisher?
  ChefBot description ROS package, creating / Creating a ChefBot description ROS package
  pocketsphinx package, installing in / Installation of the pocketsphinx package in ROS Indigo
ROS interface, of OpenCV
  about / The ROS interface of OpenCV
ROS localization
  working with / Working with ROS localization and navigation
ROS navigation
  working with / Working with ROS localization and navigation
  pros / Pros and cons of the ROS navigation
  cons / Pros and cons of the ROS navigation
ROS package
  creating / Creating an ROS package
  creating, with OpenCV support / Creating ROS package with OpenCV support
ROS packages
  URL / Installing ROS Indigo on Ubuntu 14.04.2, Creating the Gazebo model from TurtleBot packages
ROS Python driver
  writing, for ChefBot / Writing a ROS Python driver for ChefBot
Rossum's Universal Robots (R.U.R) / History of the term robot
rqt
  about / Installing and working with rqt in Ubuntu 14.04.2 LTS
  installing, in Ubuntu 14.04.2 LTS / Installing and working with rqt in Ubuntu 14.04.2 LTS
  defining, in Ubuntu 14.04.2 LTS / Installing and working with rqt in Ubuntu 14.04.2 LTS
rviz
  about / Simulating ChefBot and TurtleBot in a hotel environment
  URL / Simulating ChefBot and TurtleBot in a hotel environment

S

SAPI (Speech Application Programming Interface)
  about / Working with speech recognition and synthesis in Windows using Python
search algorithm
  about / Block diagram of a speech recognition system
Serial Clock Line (SCL)
  about / Inertial Measurement Unit
Serial Data Line (SDA)
  about / Inertial Measurement Unit
serial module
  about / Interfacing Tiva C LaunchPad with Python
Service Robot
  requisites / The Requirements of a service robot
setuptools, Python
  URL, for downloading / Installation of Julius speech recognizer and Python module
Shakey
  about / Where do robots come from?
Sharp GP2D12 sensor
  URL / Working with the IR proximity sensor
signals
  about / Qt signals and slots
Simbody
  URL / Introducing Gazebo
Simultaneous Localization And Mapping (SLAM) / Creating the Gazebo model from TurtleBot packages
single AIML file
  loading, from command-line argument / Loading a single AIML file from the command-line argument
SLAM
  working with / Working with SLAM using ROS and Kinect
SLAM, on ROS
  working with / Working with SLAM on ROS to build the map of the room
slot
  about / Qt signals and slots
slot definition
  adding, to PyQt code / Adding a slot definition to PyQt code
SolidWorks
  URL / Robot chassis design
source code
  PyAIML, installing in / Installing PyAIML from source code
spam filters / Modern definition of a robot
Speakers/Mic / Speakers/mic
specifications, ChefBot hardware / Specifications of the ChefBot hardware
speech recognition
  about / Understanding speech recognition
  decoding, from wave file / Working with PocketSphinx Python binding in Ubuntu 14.04.2
speech recognition, in Ubuntu 14.04.2
  about / Working with speech recognition and synthesis in Ubuntu 14.04.2 using Python
  output / Output
speech recognition, Julius / Speech recognition using Julius and Python in Ubuntu 14.04.2
speech recognition, Python / Speech recognition using Julius and Python in Ubuntu 14.04.2
  about / Working with Speech recognition in ROS Indigo and Python
speech recognition, ROS Indigo
  about / Working with Speech recognition in ROS Indigo and Python
speech recognition accuracy
  improving, in PocketSphinx / Improving speech recognition accuracy in PocketSphinx and Julius
  improving, in Julius / Improving speech recognition accuracy in PocketSphinx and Julius
speech recognition libraries
  about / Speech recognition libraries
  CMU Sphinx / CMU Sphinx/PocketSphinx
  PocketSphinx / CMU Sphinx/PocketSphinx
  Julius / Julius
speech recognition system
  block diagram / Block diagram of a speech recognition system
Speech SDK
  installing / Installation of the Speech SDK
  URL, for downloading / Installation of the Speech SDK
speech synthesis
  about / Speech synthesis
speech synthesis, in Ubuntu 14.04.2
  about / Working with speech recognition and synthesis in Ubuntu 14.04.2 using Python
  output / Output
speech synthesis, Python / Working with speech synthesis in ROS Indigo and Python
speech synthesis, ROS Indigo / Working with speech synthesis in ROS Indigo and Python
speech synthesis libraries
  about / Speech synthesis libraries
  eSpeak / eSpeak
  Festival / Festival
speech synthesis stages
  text analysis / Speech synthesis
  phonetic analysis / Speech synthesis
  prosodic analysis / Speech synthesis
  speech synthesis / Speech synthesis
Stanford Research Institute (SRI)
  about / Where do robots come from?
start_chat.launch file / start_chat.launch
start_speech_chat.launch file / start_speech_chat.launch
start_tts_chat.launch file / start_tts_chat.launch
state transitions
  about / Processing encoder data
steering system / Introduction to the differential steering system and robot kinematics
STereoLithography (STL) / Introduction to Blender Python APIs
synaptic
  about / Installing TurtleBot Robot packages on ROS Indigo
sys module
  about / Interfacing Tiva C LaunchPad with Python
System On Chip (SOC) / List of robotic vision sensors and image processing libraries

T

text to speech (TTS) / Speakers/mic
tf
  URL / What is a robot model, URDF, xacro, and robot state publisher?
Tiva C LaunchPad
  URL / Embedded controller board
  about / Embedded controller board
  DC geared motor, interfacing to / Interfacing DC geared motor with Tiva C LaunchPad
  quadrature encoder, interfacing with / Interfacing quadrature encoder with Tiva C Launchpad
  HC-SR04, interfacing to / Interfacing HC-SR04 to Tiva C LaunchPad
  interfacing, with Python / Interfacing Tiva C LaunchPad with Python
  MPU 6050, interfacing with / Interfacing MPU 6050 with Tiva C LaunchPad
  ChefBot sensors, interfacing to / Interfacing ChefBot sensors with Tiva C LaunchPad
topics / The ROS Computation Graph
Trig pin
  about / Interfacing HC-SR04 to Tiva C LaunchPad
truth table
  about / Interfacing DC geared motor with Tiva C LaunchPad
TurtleBot
  URL / Robot chassis design
  simulating, Gazebo used / Simulating TurtleBot using Gazebo and ROS
  simulating, ROS used / Simulating TurtleBot using Gazebo and ROS
TurtleBot packages
  Gazebo model, creating from / Creating the Gazebo model from TurtleBot packages
TurtleBot robot chassis design
  about / Robot chassis design
TurtleBot Robot packages
  installing, on ROS Indigo / Installing TurtleBot Robot packages on ROS Indigo
  URL / Installing TurtleBot Robot packages on ROS Indigo
TurtleBot ROS packages
  installing, apt package manager used / Installing TurtleBot ROS packages using the apt package manager in Ubuntu

U

Ubuntu 14.04.2
  OpenNI, installing in / Installing OpenNI in Ubuntu 14.04.2
  PocketSphinx, setting in / Setting up PocketSphinx and its Python binding in Ubuntu 14.04.2
  Python binding, setting in / Setting up PocketSphinx and its Python binding in Ubuntu 14.04.2
  eSpeak, setting in / Setting up eSpeak and Festival in Ubuntu 14.04.2
  Festival, setting in / Setting up eSpeak and Festival in Ubuntu 14.04.2
  PyAIML, installing in / Installing PyAIML on Ubuntu 14.04.2
Ubuntu 14.04.2 LTS
  Qt, installing on / Installing Qt on Ubuntu 14.04.2 LTS
UI file
  converting, into Python code / Converting a UI file into Python code
ultrasonic distance sensors
  working with / Working with ultrasonic distance sensors
  HC-SR04, interfacing to Tiva C LaunchPad / Interfacing HC-SR04 to Tiva C LaunchPad
ultrasonic sensor
  selecting / Selecting the ultrasonic sensor
ultrasonic sensors
  about / Ultrasonic sensors
Uniform Resource Identifier (URI)
  about / Working with ChefBot Python nodes and launch files
Universal Asynchronous Receiver/Transmitter (UART) / Embedded controller board
Unmanned Aerial Vehicles (UAVs) / Working with Inertial Measurement Unit
Update_Time() function
  about / Embedded code for ChefBot
URDF / What is a robot model, URDF, xacro, and robot state publisher?
  URL / What is a robot model, URDF, xacro, and robot state publisher?, chefbot_base_gazebo.urdf.xacro
  functionality / Creating a ChefBot description ROS package
  chefbot_base.urdf.xacro / Creating a ChefBot description ROS package
  chefbot_base_gazebo.urdf.xacro / Creating a ChefBot description ROS package
  chefbot_gazebo.urdf.xacro / Creating a ChefBot description ROS package
  chefbot_library.urdf.xacro / Creating a ChefBot description ROS package
  chefbot_properties.urdf.xacro / Creating a ChefBot description ROS package
  common_properties.urdf.xacro / Creating a ChefBot description ROS package
  kinect.urdf.xacro / Creating a ChefBot description ROS package
  chefbot_circles_kinect_urdf.xacro / Creating a ChefBot description ROS package

V

V-REP
  about / Understanding robotic simulation
  URL / Understanding robotic simulation
VirtualBox
  URL / Installing ROS Indigo on Ubuntu 14.04.2
VNH2SP30 motor
  about / Interfacing ChefBot sensors with Tiva C LaunchPad

W

wave file
  speech recognition, decoding from / Working with PocketSphinx Python binding in Ubuntu 14.04.2
web camera
  image, capturing from / Capturing from web camera
webots
  about / Understanding robotic simulation
wheel base
  about / Wheel odometry calibration
wheel encoders
  about / Explaining of the forward kinematics equation
wheel odometry calibration
  defining / Wheel odometry calibration
  error analysis, of wheel odometry / Error analysis of wheel odometry
  error correction / Error correction
wheels
  selecting, for robot / Selecting motors, encoders, and wheels for the robot
wheels, Robot Drive Mechanism
  selecting / Selection of motors and wheels
Windows
  URL / Installing the Energia IDE
Windows Speech SDK
  about / Windows Speech SDK
Wiring
  URL / Differential wheeled robot
world files / What is a robot model, URDF, xacro, and robot state publisher?

X

xacro / What is a robot model, URDF, xacro, and robot state publisher?
  URL / What is a robot model, URDF, xacro, and robot state publisher?
Xbox Kinect
  calibrating, ROS used / The Calibration of Xbox Kinect using ROS

Y

yaw
  about / Introduction to the differential steering system and robot kinematics