
University of Manchester
School of Computer Science

Third Year Project Report 2015

Robocar: An IoT gesture-controlled robot

Author: Atanas Abrashev

Supervisor: Jim Garside


Abstract

Robocar: An IoT gesture-controlled robot

Author: Atanas Abrashev

The number of devices able to connect to the internet and communicate with each other has grown rapidly over the past decade. Advancements in network communication, as well as in microprocessor and sensor design, have given rise to new and exciting ways to make real objects talk to each other. This report describes a project developed to investigate the challenges of the Internet-of-Things scenario, where two or more objects communicate with each other while simultaneously manipulating large amounts of data.

A wireless robot was built that one can control from anywhere in the world, as long as they have an internet connection. Moreover, several means of controlling the robot were created, exploring the notion of using hand gestures in the process. Various communication technologies were investigated, and the result is a very responsive system which has almost no latency overhead and is largely dependent on the quality of the internet connection between the robot and the other communicating parties.

Supervisor: Jim Garside


Acknowledgements

I would like to thank my girlfriend for her constant support in hard times and my parents for all the sacrifices they have made so I could attend university. I love you and I am forever grateful to you.


Contents

1 Introduction
   1.1 Overview and motivation
   1.2 Aims and Objectives
   1.3 Report structure

2 Background and Research
   2.1 Robot chassis
       2.1.1 Steering Principles
       2.1.2 Size and Power
       2.1.3 The chosen platform
   2.2 Motor Controller
       2.2.1 The T’Rex Motor controller
   2.3 Raspberry Pi
       2.3.1 The Raspberry Pi Camera Module
   2.4 Batteries
   2.5 Leap Motion Controller
   2.6 LeapJS
   2.7 The web server
       2.7.1 Node.js
       2.7.2 Express.js
       2.7.3 Socket.io
       2.7.4 Jade
       2.7.5 Johnny-five

3 Design
   3.1 System overview
   3.2 Hardware design
   3.3 Software design
       3.3.1 Moving control code to the web browser
       3.3.2 Web interface
       3.3.3 Web server
       3.3.4 Communication Protocols
       3.3.5 Control message
   3.4 Controller Interfaces
       3.4.1 Controller loop
       3.4.2 Keyboard
       3.4.3 Phone orientation
       3.4.4 Gesture control: using hand orientation
       3.4.5 Gesture control: using hand position

4 Implementation
   4.1 Robocar setup
   4.2 Web interface
       4.2.1 Adjusting the input data for differential steering
       4.2.2 Keyboard control
       4.2.3 Controlling from the phone - using accelerometer data
       4.2.4 Gesture control - using the Leap Motion controller
       4.2.5 Gesture control - using hand orientation
       4.2.6 Gesture control - using hand position
       4.2.7 The result
   4.3 Web server
       4.3.1 Content delivery system
       4.3.2 Socket server
       4.3.3 Receiving control messages
       4.3.4 Processing the control messages
       4.3.5 Pseudocode

5 Testing and Evaluation
   5.1 Approach
   5.2 Results
   5.3 Evaluation

6 Conclusion
   6.1 Achievements
   6.2 Taking the project further
   6.3 Closing remarks

Bibliography

A Camera setup
   A.1 Prerequisites
   A.2 Starting the stream

B Communication between the Pi and the T’Rex controller
   B.1 I2C
   B.2 USB


List of Figures

1.1 Project overview
2.1 The Wild Thumper robot chassis [5]
2.2 The T’Rex Motor Controller [7]
2.3 Raspberry Pi 1 Model B+ [4]
2.4 Raspberry Pi Camera Module [6]
2.5 Leap Motion controller [1]
2.6 The Leap Motion controller’s view of your hands [3]
2.7 The Leap Motion coordinate system [2]
2.8 Synchronous versus asynchronous server model
3.1 High-level system overview
3.2 Illustration of all hardware components of the robot wired together
3.3 Software design overview
3.4 Web user interface design
3.5 Phone orientation control
3.6 Hand orientation control
3.7 Using hand position as control input
4.1 The final look of the web user interface


List of Tables

5.1 List of estimated ping responses between Robocar and the world


Chapter 1

Introduction

This chapter explains the motivation behind the project, the main objectives and the overall report structure.

1.1 Overview and motivation

Going into the second decade of the 21st century, we are seeing an ever increasing number of cloud platforms providing services not only to small and large businesses, but also to home users all over the world. A lot of the functionality previously found in desktop applications is now being ported to online services, which has enabled users to do powerful things at a very small cost. Because all of this happens on the Internet, there has been a tremendous amount of research in real-time communication over the Internet, as well as in ways to leverage networks and operating software to achieve maximum efficiency in terms of computational power and latency. This, in conjunction with the steady advances in sensor and microprocessor design, has led to a new paradigm in the interaction between real-world objects, called the Internet of Things.

The Internet of Things stems from the idea that you can put a small computing device and sensors on every object, be it a vehicle, an animal or even a human being. You can then leverage the existing internet infrastructure to make these objects communicate with each other, creating complex networks of heterogeneous objects. One example of this phenomenon is controlling home appliances such as a washing machine or a thermostat from your smartphone or tablet. You could also put microchips on a flock of birds and have sensors stream real-time data about their movement patterns to study their relation to weather changes. The possibilities are truly limitless, and that is why it is important to investigate how much we can achieve in this area.

1.2 Aims and Objectives

The aim of this project was to build a robotic vehicle, equip it with internet capability and create several means of controlling it, thus creating an Internet of Things scenario. Being able to move the car in a sensible way using hand gestures, while at the same time keeping the communication latency as low as possible, were my two primary objectives. I also set out to build a unified web interface, which would enable anyone in the world to control my robot and receive live video feedback as long as they have an internet connection.


Figure 1.1: Project overview.

1.3 Report structure

The report is divided into the following chapters:

1. Introduction: This first chapter already outlined the motivation behind the project and the main objectives that were set at the beginning.

2. Background and Research: This chapter presents the research and the background knowledge needed to complete the project. Each of the hardware and software components is explained without going into too much detail of how the pieces fit together.

3. Design: The chapter gives an overview of the interaction between the different components of the system, and the reasoning behind some of the design decisions.

4. Implementation: The chapter explores the development decisions taken, and gives high-level implementation details and an overview of the more critical pieces of code.

5. Testing and Evaluation: This chapter shows the approach taken in order to test the robot and summarizes the overall success of the project.

6. Conclusion: This chapter provides a summary of the work done, including achievements, difficulties and how the project can be evolved.


Chapter 2

Background and Research

This chapter presents the research and the background knowledge needed to complete the project, including all hardware and software components. A large number of technologies were explored; to avoid having this chapter grow out of proportion, some topics are given only a high-level explanation that should support the understanding of later chapters.

2.1 Robot chassis

Given the time constraints, it was not feasible to build a chassis from scratch. For this reason, several robotic platforms available on the market were investigated. The major differences between them and the chosen platform are shown below.

2.1.1 Steering Principles

The first consideration in choosing a chassis was the steering principle employed:

1. Ackerman Steering: This is the most common type of steering today, used by most modern cars. In this type of steering, two wheels change their direction to achieve the desired turning angle. This model turned out to be the least maneuverable of the three considered.

2. Omni Wheels: Omni-directional wheels are the most maneuverable option, but they are also very expensive and can sometimes be unreliable on bad terrain, which would hinder testing of the controls.

3. Differential Steering: In this type of steering, turning is accomplished by driving the two sides of the car at different speeds. This was the chosen principle, as it allowed better maneuverability than the Ackerman model while preserving simplicity and keeping the cost of the chassis minimal.

2.1.2 Size and Power

The chassis needed to be large enough to accommodate a microcontroller, a camera module, a networking module and space for sensors. Moreover, the robot had to be testable on rough terrain far away from the human controller, as would be the case in most real-world applications of remote vehicle control.


2.1.3 The chosen platform

Figure 2.1: The Wild Thumper robot chassis. [5]

The Dagu Wild Thumper was chosen for this project. It is a four-wheel differential-drive chassis in which each side has two motors wired in parallel, meaning that only two motor control channels are needed to drive the robot. The motors are also powerful enough to carry 5kg of weight, allowing for future extensions.

2.2 Motor Controller

Figure 2.2: The T’Rex Motor Controller. [7]

Motors are driven by an electronic circuit called an H-bridge. All motor controllers available have at least one of these circuits, and all that needs to be known about them here is that they supply input pins for speed, direction and sometimes electronic braking, which can be set in software. For this project, a dual H-bridge was needed to drive the left and right motor channels. Finally, the motor controller needed to be able to supply at least 14A of current, as specified in the Wild Thumper chassis documentation. As a result, the T’Rex Motor Controller from DAGU was chosen.

2.2.1 The T’Rex Motor controller

This device can handle currents in excess of 40A per motor, has two motor channels, and offers several means of data communication with other devices, including I2C, USB and pins for Bluetooth and radio modules. Being a fully programmable Arduino-compatible board, it could have been used as the main robot microcontroller. However, the capability to host an operating system and a web server was required, and for this reason another device was used, described in the next section. For more information on how the two devices talk to each other, consult Appendix B.

2.3 Raspberry Pi

Figure 2.3: Raspberry Pi 1 Model B+ [4]

The Raspberry Pi is a credit-card sized computer with the ability to interact with the outside world. It can talk to the T’Rex controller via either a USB or an I2C link. It also supports wireless connectivity, which is essential for this Internet-of-Things project. Most importantly, it can run a Linux-based operating system, which means that a web server can be hosted on it and many available applications and APIs can be used to aid this project. A vanilla version of the Raspbian operating system was used, the only modification being the enabling of the I2C kernel module, which is disabled by default. There were alternatives to this device, but none offered any particular benefits.

2.3.1 The Raspberry Pi Camera Module

This is a 5-megapixel camera module, capable of 1080p video, which connects directly to the Raspberry Pi via the Camera Serial Interface. The camera setup is explained in Appendix A.


Figure 2.4: Raspberry Pi Camera Module [6]

2.4 Batteries

In order to have a fully mobile platform, the battery requirements of the system had to be investigated. The Raspberry Pi requires a steady voltage of 5V and draws about 1A of current with a camera and a wireless dongle attached. Therefore, in order to run it for 5 hours, a 5V battery with a capacity of 5000mAh is needed. The T’Rex Motor controller has a built-in voltage regulator and can take 6V SLA, 7.2V NiMh and 7.4V LiPo batteries. Having investigated the stall and no-load currents of the motors, it was concluded that a 5000mAh battery would last about an hour, depending on the terrain.

2.5 Leap Motion Controller

Figure 2.5: Leap Motion controller [1]

The Leap Motion controller is a motion-sensing device which uses infrared stereo cameras as tracking sensors. Images are streamed to the computer via a USB link, and a number of mathematical operations are performed on the raw sensor data to transform it into motion data. This data is then exposed through an API, described in the next section. It is important to mention that the Leap Motion controller is not a plug-and-play device, in the sense that it requires software to be installed, which starts a service called the Leap Service. This service acts as an API endpoint, meaning that all calls to the API go through it.

2.6 LeapJS

The Leap Motion controller exposes its functionality in several languages. For this project I decided to use the Javascript API, called LeapJS. Javascript is the language of the web, and all modern browsers can run Javascript code. This meant that all the gesture control code I wrote could run in the browser, and the users of the robot would not need any preliminary setup: all that is needed to gesture-control the robot is an internet connection and a Leap Motion controller attached to their computer.

Figure 2.6: The Leap Motion controller’s view of your hands [3]

The Leap Motion system recognizes and tracks hands and fingers. It employs a right-handed Cartesian coordinate system with its origin at the top of the Leap device. The y-axis is vertical, with values increasing from bottom to top. The x-axis and the z-axis lie in a horizontal plane, parallel to the device’s surface. The z-axis has values increasing toward the user, so a hand extended away from the user has a negative z coordinate.

Figure 2.7: The Leap Motion coordinate system. [2]


The controller captures images at a rate of between 20 and 100 frames per second. For every frame, it creates a Javascript Frame object, which we can inspect by querying the Leap Service. This Frame object contains all the motion data in a JSON format. In particular, it contains data on the position and orientation of the hands and fingers present in the device’s field of view, in its coordinate system. As will be shown later, this is all the data needed to create a responsive gesture-control interface for the robot.
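As a minimal sketch of what reading this data looks like with LeapJS (assuming the leap.js script is loaded in the page and the Leap Service is running locally; the logging is purely illustrative):

    // Poll the Leap Service for frames; Leap.loop fires once per frame.
    Leap.loop(function (frame) {
      if (frame.hands.length === 0) return;    // no hand in the field of view
      var hand = frame.hands[0];
      // palmPosition is [x, y, z] in millimetres, in the Leap coordinate system
      console.log('position:', hand.palmPosition);
      // roll() and pitch() return the palm orientation angles in radians
      console.log('roll:', hand.roll(), 'pitch:', hand.pitch());
    });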

2.7 The web server

As mentioned earlier, the robot hosts a web server, which serves a web interface to the end users. There are two requirements that the server needs to fulfill.

First of all, the server needed to be able to handle requests in real time, without noticeable delay between requests. This is very important, because at least 20 frames of data per second need to be sent in order to have a fluid response from the robot. Most standard server technologies, such as PHP with Apache, have a synchronous model, which means that when a request is sent, the server must process it and send an acknowledgment to the client before another request can be processed. This is unacceptable, as it can lead to a lot of latency between an action from the controller and a reaction from the robot as the requests add up. Therefore the choice of server was limited to one which processes requests asynchronously.

Figure 2.8: Synchronous versus asynchronous server model. The advantage of the latter is easily seen in this example: there is no wait time between requests, and they are processed as they arrive. In the time that the first model completed 2 requests, the second model managed to complete 3, even with more time between requests.

Secondly, there was a need for a stack of technologies that could handle the communication flows, as creating these would be beyond the scope of this project. Therefore the server platform also needed to have a large community with many libraries.


2.7.1 Node.js

Node.js was the server technology of choice. Node.js is a platform built on top of Google Chrome’s JavaScript runtime. It uses an event-driven, non-blocking I/O model, which means that functionality is implemented using callback functions that are called when a certain event happens. It also follows the asynchronous model mentioned above. The event-driven model allowed for the creation of event callbacks that describe what happens when a control message is received. Node.js programs are written in Javascript.

2.7.2 Express.js

Express.js is a minimalistic web application framework written on top of Node.js. It was used to create the HTTP web server that served the web interface.

2.7.3 Socket.io

Socket.io is a Node.js library that enables real-time, bidirectional, event-based communication. It was used to handle the transfer of data between the robot and the controllers by creating a socket server that sits on top of the web server; all clients connect to that server to start controlling the car.
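A minimal sketch of this server skeleton is shown below (illustrative only; the port number, the 'public' folder and the 'control' event name are assumptions, not the project’s actual values):

    var express = require('express');
    var app = express();
    var http = require('http').Server(app);
    var io = require('socket.io')(http);

    app.use(express.static('public'));       // serve the front-end code

    io.on('connection', function (socket) {
      socket.on('control', function (msg) {
        // msg carries the control primitives (speed, direction, brakes)
        console.log('control message:', msg);
      });
    });

    http.listen(3000);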

2.7.4 Jade

Jade is a Node.js template engine for writing HTML code. A template engine was useful in this case because it allowed me to use the same front-end code for the web interface on both the PC and the smartphone by just changing a few variables.

2.7.5 Johnny-five

Johnny-five is a Javascript robotics framework with Node.js support. It was used for communication between the Raspberry Pi and the T’Rex Motor Controller. Control messages received on the server were transformed into Arduino-compatible commands, which were then relayed to the T’Rex Motor Controller via a USB link, using the Firmata protocol.
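A rough johnny-five sketch of driving two motor channels might look like the following (the pin numbers are assumptions, and the board must be running a Firmata-compatible sketch over USB):

    var five = require('johnny-five');
    var board = new five.Board();             // auto-detects the USB serial port

    board.on('ready', function () {
      // one Motor per channel; the pwm pin sets speed, the dir pin direction
      var left  = new five.Motor({ pins: { pwm: 3,  dir: 12 } });
      var right = new five.Motor({ pins: { pwm: 11, dir: 13 } });
      left.forward(128);                      // half speed (0-255 range)
      right.forward(128);
    });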


Chapter 3

Design

The chapter gives an overview of the interaction between the different components of the system and the reasoning behind some of the design decisions.

3.1 System overview

The main use case of this system is a user sending control messages from a web interface, which is served by a web server hosted on the Raspberry Pi attached to the robot. The web interface allows several means of control as well as video feedback. The web server waits for control messages and, after processing them, relays them to the motor controller, which initiates movement of the car. All components are designed to be loosely coupled and can be replaced by similar technologies as long as they support the same communication protocols.

Figure 3.1: High-level system overview

3.2 Hardware design

The diagram below illustrates the complete hardware design of the robot and shows the interactions between the different components. All hardware design decisions were based on the research outlined in the previous chapter, ensuring that the system was as efficient and cost-effective as possible.


Figure 3.2: Illustration of all hardware components of the robot wired together

To begin with, two rechargeable battery packs were used: one to power the T’Rex Motor Controller and the motors, and one to power the Raspberry Pi. Even though using one battery was possible and would have been a more elegant solution, the motors would sometimes pull the battery voltage down and cause the Raspberry Pi to reset. Having two power sources also meant a longer uptime for the robot. One 5V 7800mAh power bank was attached via a USB to micro USB cable, and one 7.4V 5000mAh LiPo battery was attached to a power switch. Having a switch connected to the motor controller meant that everything except the Raspberry Pi could be turned off, which allowed easier debugging and prototyping later on. The motor controller was mounted on the robot chassis, and the left and right control channels were wired to a prototyping board connecting the left motors and the right motors in parallel. The Raspberry Pi was also mounted on top of the chassis, with a camera module and a wireless dongle attached to it, providing the desired internet connectivity. It was also connected to the motor controller via a USB to mini USB cable, providing a fast and reliable connection between the two via the Firmata protocol, mentioned in Appendix B.

3.3 Software design

The main consideration here was ensuring minimal added overhead on the latency between an action from the user and a reaction from the robot. Another important aspect was making sure that the system is as easy to use as possible, without any preliminary setup.

The two largest components are the web interface and the web server.


Figure 3.3: Software design overview

The web server hosts a socket server which listens for control messages, and it contains the code that updates the actuators of the robot. It also acts as a content delivery system, in the sense that it delivers the user interface depending on device type. The web interface sends control messages to the server, created by one of the controller interfaces: keyboard control, phone control or Leap control. There is also plenty of feedback on the issued control commands, as well as live video feedback from the robot. A socket client opens a connection to the socket server to start streaming the control messages.

3.3.1 Moving control code to the web browser

The approach taken to remove user setup was to move all functionality into the user’s web browser, eliminating the need for any prerequisites other than an internet connection. To achieve this, all the control code was written in Javascript and stored on the Raspberry Pi. When a user connected to the web interface, the web server would deliver all the control code into their browser. As an added benefit, this method also leveraged the computational power of the user’s device, since it can safely be assumed that their computer would be faster than the Raspberry Pi.

3.3.2 Web interface

To create the interface, the main control primitives had to be established. It was decided that users would be able to control the speed, direction and braking of the robot. The web interface was then designed to provide video feedback as well as the current values of those primitives.


Figure 3.4: Web user interface design

3.3.3 Web server

The web server needed to have several properties. First of all, it needed to load all web interface code on request. Secondly, it had to support content delivery based on device type. This was important because at the beginning of the project it was decided that users should be able to use both their computer and their smartphone to connect to the robot’s interface. Finally, the server also had to support bidirectional real-time communication with the web interface, to facilitate the exchange of data.

3.3.4 Communication Protocols

Communication between the web interface and the web server happens by transferring messages via sockets. For this purpose, a socket server was hosted on top of the web server. In this way, when a user connects to the web interface, the client code already knows the location of the socket server and opens a connection to it. When the user initiates a robot control action, a control message is formed containing values for all control primitives, such as speed, direction and brakes. Messages should then be sent at a steady rate to give the illusion of fluidity of control. It is important to find a balance in the number of messages: too many and the Raspberry Pi might not be able to handle them, while too few might make the system unresponsive. Messages are generated via one of the controller interfaces explained in the next section.

3.3.5 Control message

Control messages needed to be small, because a large number of them would be sent over the Internet. Each message contains the direction, which is binary (forward/backward), the left motor speed and the right motor speed, which together are enough to describe the velocity vector, as well as a binary true/false value for the electronic brakes.
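As an illustration, such a message could be represented as a small Javascript object (the field names here are assumptions; the report does not fix an exact schema):

    var message = {
      direction: 'forward',   // binary: 'forward' or 'backward'
      leftSpeed: 200,         // left motor speed
      rightSpeed: 120,        // lower right speed here means the robot turns right
      brake: false            // electronic brakes engaged or not
    };
    socket.emit('control', message);   // streamed over the socket connection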


3.4 Controller Interfaces

The controller interfaces were divided so that the code for each device type is in a separate Javascript file, while common code, such as global variables and socket connections, is kept in a single common.js file.

3.4.1 Controller loop

Each interface modifies the variables that form a control message. The control interfaces share a controller loop which sends a stream of control messages at a steady frame rate. This means that switching between the interfaces is transparent to the user. In fact, using this approach one could control the robot using several interfaces simultaneously without any conflicts.
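A sketch of such a shared loop (illustrative; the 20 Hz rate and the variable names are assumptions, and socket is a connected socket.io client as in the earlier sketches):

    // Shared state, mutated by whichever controller interface is active.
    var direction = 'forward', leftSpeed = 0, rightSpeed = 0, brake = false;

    // The loop itself: stream the current state at a steady rate.
    setInterval(function () {
      socket.emit('control', {
        direction: direction,
        leftSpeed: leftSpeed,
        rightSpeed: rightSpeed,
        brake: brake
      });
    }, 50);   // 50 ms per message = 20 messages per second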

3.4.2 Keyboard

The simplest kind of control was using the W, A, S, D keys on the keyboard to represent the FORWARD, LEFT, BACKWARD, RIGHT directions respectively. This is very common in vehicle-driving computer simulations and games. Being very easy to implement and test, this method was used most of the time for debugging and testing functionality related to the communication flows.

3.4.3 Phone orientation

Modern smartphones have devices such as a gyroscope, an accelerometer and a compass that collectively give information about the physical orientation and movement of the hosting device.

Figure 3.5: Phone Orientation control


3.4.4 Gesture control: using hand orientation

What if, instead of the mobile phone’s orientation, we used the orientation of our hand? We could use the same algorithm, but apply it to the palm orientation data that comes from the Leap Motion controller.

Figure 3.6: Hand orientation control

3.4.5 Gesture control: using hand position

The idea here is to use the Leap API to get the hand position in 3D space with respect to the Leap Motion controller (the origin). The further away the hand is from the origin, the larger the speed of the robot, up to a predefined maximum. This maximum is the radius of a circle which lies in the XZ plane, centered at the origin of the motion controller. This also gives us a way to measure direction, where the Z-axis describes the forward-backward movement and the X-axis describes the left-right movement.

Figure 3.7: On the left we can see two positions of a hand in 2D space, and on the right we see how they are translated to velocity vectors internally


Chapter 4

Implementation

The chapter explores the development decisions taken, and gives high-level implementation details and an overview of the more critical pieces of code.

4.1 Robocar setup

After assembling the robot components as specified in the hardware design diagram, a few things needed setting up. First of all, the Raspberry Pi needed to be loaded with a Linux-based operating system and have all the client and server libraries installed. As code was developed and tested, it was deployed to the Raspberry Pi via a source control system. The actions needed to initiate the system were booting the Raspberry Pi and starting the web server once it was developed. The robot could then be controlled by connecting to the web interface.

4.2 Web interface

In order to control Robocar, the user needs to be connected to the web interface. When that happens, data is collected from all available input devices, such as the keyboard and the Leap Motion controller, and streamed over a socket connection to the Raspberry Pi to initiate motor movement. The web interface provides camera feedback as well as the values being sent to the robot.

4.2.1 Adjusting the input data for differential steering

Before going into the control implementations, it is important to mention the kind of data needed to control a differential-steering car. As mentioned in Chapter 2, most cars use the Ackerman steering model. With this model there are two input channels: a throttle channel, which gives the speed, and a steering channel, which gives the angle at which the front wheels rotate to achieve a turn. This is the most sensible model to describe the velocity vector of a car, so the input data from the various controllers is modelled this way. Since our robot uses differential steering, the data needed to be converted. The approach taken is to use the throttle data as the maximum speed for both motors. The left and right motor speeds are then both functions of the steering data: a ’full’ left turn reduces the left motor speed to 0, while a ’half’ left turn reduces the left motor speed to half the right speed. This gives a method to calculate the ratio between the left and right motor speeds as well as the maximum speed.
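A small sketch of this conversion in Javascript (a hypothetical helper; it assumes throttle in [0, 1] and steer in [-1, 1], with negative meaning left):

    function toDifferential(throttle, steer) {
      var left = throttle, right = throttle;
      if (steer < 0) {
        left = throttle * (1 + steer);    // turning left: slow the left motors
      } else if (steer > 0) {
        right = throttle * (1 - steer);   // turning right: slow the right motors
      }
      return { left: left, right: right };
    }

    // Example: a 'half' left turn at full throttle
    // toDifferential(1.0, -0.5)  =>  { left: 0.5, right: 1.0 }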

4.2.2 Keyboard control

With keyboard control, the car moves at a fixed speed, so the throttle is always the same. The implementation leverages the research on differential driving: in order to turn left or right, we stop the left or right motors respectively, and to move forward or backward we supply the same amount of current to both the left and right motors.

Here is pseudocode of the implementation:

{
    speed = fixed_throttle_value
    enum DIRECTIONS = {FORWARD, BACKWARD, LEFT, RIGHT}
    start listening for user key presses
    if a user presses a key
        switch (key_press)
            case 'W': direction = FORWARD,  left = 1, right = 1
            case 'S': direction = BACKWARD, left = 1, right = 1
            case 'A': direction = LEFT,     left = 0, right = 1
            case 'D': direction = RIGHT,    left = 1, right = 0
            default:  do nothing
        leftmotorspeed = rightmotorspeed = 0
        if (left)  leftmotorspeed  = speed
        if (right) rightmotorspeed = speed
        moveMotors(direction, leftmotorspeed, rightmotorspeed)
}

4.2.3 Controlling from the phone - using accelerometer data

Javascript provides an event called ’deviceorientation’ that carries information about the orientation and movement of the device. From it we can get the rotation angles of the plane which is parallel to the device’s screen. We can then use the tilt of the phone forward or backward as our motor speed, and the tilt left or right as the amount of left or right turn we take. In other words, using what we know about differential steering, we can derive the left and right motor speeds by using the vertical tilt as the speed cap and the horizontal tilt as the ratio of the left and right motor values.

Here is pseudocode of the implementation:

{
    start listening for device orientation changes
    if a device changes orientation
        steer = orientationdata.gamma
        throttle = orientationdata.beta
        normalize orientation data with motor speed limits
        {
            speed = normalized throttle
            ratio = normalized steer
            direction = front or backward tilt
        }
        if (steering left)
            rightmotorspeed = speed
            leftmotorspeed  = speed - speed * ratio
        else if (steering right)
            leftmotorspeed  = speed
            rightmotorspeed = speed - speed * ratio
        else
            leftmotorspeed = rightmotorspeed = normalized speed
        moveMotors(direction, leftmotorspeed, rightmotorspeed)
}
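In browser Javascript the event hookup looks roughly like this (a sketch; the 45-degree normalization range is an assumption, and it reuses the hypothetical toDifferential helper and shared loop variables from the earlier sketches):

    window.addEventListener('deviceorientation', function (e) {
      // e.beta is the front/back tilt in degrees, e.gamma the left/right tilt
      var throttle = Math.max(-45, Math.min(45, e.beta)) / 45;   // -1..1
      var steer    = Math.max(-45, Math.min(45, e.gamma)) / 45;  // -1..1
      direction = throttle >= 0 ? 'forward' : 'backward';
      var speeds = toDifferential(Math.abs(throttle), steer);
      leftSpeed  = speeds.left;
      rightSpeed = speeds.right;
    });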

4.2.4 Gesture control - using the Leap Motion controller

The next two subsections show how the Leap data is used to achieve gesture control. To make that data available, the code simply establishes a connection to the local Leap Service and then starts to poll it for frames of data.

4.2.5 Gesture control - using hand orientation

Using hand orientation extends the idea of using the phone orientation. We can apply the same algorithm, but this time the plane is parallel to the palm of the hand. In the Leap coordinate system, we can use the rotation of the hand around the x-axis to derive the ratio between the left and right motor speeds, and the rotation of the hand around the z-axis to derive the speed cap of both motors.

Here is pseudocode of the implementation (note the similarity with the phone orientation pseudocode):

{
    start listening for leap data frames
    if the hand changes orientation
        steer = leapdata.hand_pitch_angle()
        throttle = leapdata.hand_roll_angle()
        normalize orientation data with motor speed limits
        {
            speed = normalized throttle
            ratio = normalized steer
            direction = front or backward tilt
        }
        if (steering left)
            rightmotorspeed = speed
            leftmotorspeed  = speed - speed * ratio
        else if (steering right)
            leftmotorspeed  = speed
            rightmotorspeed = speed - speed * ratio
        else
            leftmotorspeed = rightmotorspeed = normalized speed
        moveMotors(direction, leftmotorspeed, rightmotorspeed)
}

4.2.6 Gesture control - using hand position

Section 3.4.5 explains the main ideas behind this method. We inspect each Leap data frame and find the position of the hand in the Leap’s coordinate system. We can find the speed cap by calculating how far the hand is from the origin. Then we can derive the ratio between the left and right motor speeds by calculating how far left or right the hand is from the origin.

Here is pseudocode of the implementation:

{
    start listening for leap data frames
    if the hand changes position
        handX = hand.X
        handZ = hand.Z
        calculate direction with relation to how much the hand is
            extended away from the origin in the XZ plane
        calculate speed by finding how far the hand is from the origin
        calculate ratio between left and right motors by using the
            calculated direction
        normalize the values with the motor speed limits
        if (steering left)
            rightmotorspeed = speed
            leftmotorspeed  = speed - speed * ratio
        else if (steering right)
            leftmotorspeed  = speed
            rightmotorspeed = speed - speed * ratio
        else
            leftmotorspeed = rightmotorspeed = normalized speed
        moveMotors(direction, leftmotorspeed, rightmotorspeed)
}
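A corresponding browser-side sketch (illustrative; RADIUS is an assumed maximum reach in millimetres, and it again reuses the hypothetical toDifferential helper and shared loop variables from the earlier sketches):

    var RADIUS = 200;   // assumed maximum hand reach, in mm

    Leap.loop(function (frame) {
      if (frame.hands.length === 0) return;
      var pos = frame.hands[0].palmPosition;    // [x, y, z] in mm
      var x = pos[0], z = pos[2];
      var dist = Math.sqrt(x * x + z * z);      // distance from origin in XZ plane
      var speed = Math.min(dist / RADIUS, 1);   // speed cap grows with distance
      direction = z <= 0 ? 'forward' : 'backward';  // negative z points away from the user
      var steer = dist > 0 ? x / dist : 0;      // left/right component, -1..1
      var speeds = toDifferential(speed, steer);
      leftSpeed  = speeds.left;
      rightSpeed = speeds.right;
    });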

4.2.7 The result

Finally, here is a picture of what the final version of the web interface looks like.

The video feedback is placed in the middle. There are two widgets to the sides, which display the direction and the speed of the robot. The particular values are also shown in a table below them. If the same web page is loaded from a smartphone, only the camera feedback is displayed.


Figure 4.1: The final look of the web user interface

4.3 Web server

Even though there were several components to the web server, they all shared the same code base and started simultaneously once the server was started.

4.3.1 Content delivery system

To deliver static files, such as all the controller Javascript files, to the end user, an HTTP server was needed. I used the implementation provided by the Express.js web application framework. The server was configured to inspect the HTTP user-agent string when a request was made, so that it could determine the type of device making the request. All requests to the server were routed to a folder on the Raspberry Pi that contained all the front-end code.


4.3.2 Socket server

Once the HTTP server was running, users could connect to the web interface. After that, a socket server was attached on a different port. The server then modifies the Javascript code that gets delivered to the client to include the address and port of the socket server. This sets up the environment for the bidirectional data streaming between the web server and the web interface.

4.3.3 Receiving control messages

Upon receiving a control message, the server creates a timeout which expires in 100ms. If the server receives another command within these 100ms, the timeout is reset. This prevents the robot from crashing in case the connection breaks and the client stops sending commands. Then the time between the new and the last control message is measured. If the time is less than a predefined threshold, the control message is discarded, at the cost of a small loss of fluidity in movement. The reasoning behind this is that we do not want to process too many requests, because the Raspberry Pi might not be able to handle them.
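A sketch of this logic on the server (illustrative; the event name, the rate-limit constant and the stopMotors/applyControlMessage helpers are assumptions):

    var watchdog = null;
    var lastMessageTime = 0;
    var MIN_INTERVAL = 30;   // discard messages arriving faster than this (ms)

    socket.on('control', function (msg) {
      // Dead man's switch: stop the robot if messages cease for 100 ms.
      if (watchdog) clearTimeout(watchdog);
      watchdog = setTimeout(stopMotors, 100);

      // Rate limiting: drop messages that arrive too quickly.
      var now = Date.now();
      if (now - lastMessageTime < MIN_INTERVAL) return;
      lastMessageTime = now;

      applyControlMessage(msg);   // relay to the motor controller
    });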

4.3.4 Processing the control messages

If a control message gets through for processing, we enter the robot actuator control routine, which is what makes the robot move. Here, we first inspect the control message and strip the control primitives into variables, such as left motor speed, right motor speed, brakes and direction. These values are then sent to the robot via a USB link, using the johnny-five library.

4.3.5 Pseudocode

Here is pseudocode of the server:

{
    1. Start HTTP server
    2. Attach socket server on top of the HTTP server
    3. Start listening for control messages
    4. Receive a control message
       4.1 Expire any old timeout and start a new one
       4.2 If the message was received less than TIME milliseconds
           after the previous message, discard it
       4.3 Extract control primitives from the message and send
           the values via a USB link to the robot
    5. If the timeout expired, send a stop command to the robot
}


Chapter 5

Testing and Evaluation

This chapter shows the approach taken in order to test the robot and summarizes the overall success of the project.

5.1 Approach

The testing approach was to perform black-box testing on each major component by exercising the API endpoints manually, either from the command line or from small fragments of code, and then inspecting the results. Given that the project represents a scenario, the overall system was tested by performing various modifications to the Internet-of-Things scenario: the robot was placed in various remote locations with different connectivity, and its functionality was tested over the internet from a number of different devices. Testing was executed in an iterative manner, meaning that every time a new component was created, the overall system, as well as the isolated component, was tested.

5.2 Results

Testing led to the discovery of bugs. For example, there was one issue where the ping between any device and the wireless dongle on the Raspberry Pi was intermittently very high, causing the system to become very unresponsive. It turned out that when the Pi battery voltage drops below a certain point, the dongle cannot function correctly; more than two weeks were spent debugging the code in the belief that the problem was in the implementation. Another example: at some point in the development of the project, the car would intermittently stop responding to any incoming messages and would subsequently crash. After some investigation it was discovered that this only happened on rough terrain. As it turned out, the battery for the motor controller was reaching the end of its lifetime, and every time the motors started to draw more current because of the rough terrain, the motor controller would reset itself. Towards the end of the project the car would only work while it was on top of a stand, with the wheels rotating freely in the air. The problem was easily fixed by replacing the battery, but it was also very educational about how hard it can be to debug such systems.


5.3 Evaluation

The first notion of success in this project was that, by hard-coding a few time calculations within the code, it was empirically measured that the overhead on top of the Internet’s latency in the overall response time of the system was less than 10ms. In my tests, no matter what the internet connection was, the overall response time from the robot was always less than 200ms. Assuming a good internet connection, here is a table of some estimated response times if the user is in Manchester, UK and the robot is placed in various locations around the world.

Location                    Ping
US-East (Virginia)          128 ms
US-West (California)        182 ms
US-West (Oregon)            195 ms
Europe (Ireland)             52 ms
Europe (Frankfurt)           58 ms
Asia Pacific (Singapore)    222 ms
Asia Pacific (Sydney)       372 ms
Asia Pacific (Japan)        302 ms
South America (Brazil)      249 ms

Table 5.1: As we can see, a 10 ms overhead is very small, and fast responses from the robot can be achieved from almost anywhere in the world, provided there is a good internet connection. This data was estimated using http://www.cloudping.info/.

The second notion of success comes from user feedback. I managed to get feedback from several users of my system, and the agreement was that the interface was intuitive and the users did not need any help to start controlling the robot.

The final notion of success comes from the fact that all aims set at the beginning of the project were achieved. Moreover, the main design objectives in terms of low overhead and ease of use were also achieved.


Chapter 6

Conclusion

6.1 Achievements

The primary objectives of this project, stated in section 1.2, were achieved. A robotic vehicle was created and equipped with internet functionality. The robot hosted a web server that the end users, the controllers, would connect to in order to control the device. For this reason, a web interface was also created. The interface allowed several means of interactivity, including keyboard control, gesture control using the Leap Motion controller, and control via the motion-sensing devices, such as the gyroscope and the accelerometer, present on modern smartphones. The latency between controller action and robot reaction depends almost entirely on the quality of the internet connection between them, meaning that the communication adds little to no overhead, which was one way to measure the success of the project. A further outcome of the project was devising a sensible way to use hands to control a four-wheeled vehicle. Overall, this project provided great experience, as a complex system had to be designed and implemented using a large number of technologies and APIs.

6.2 Taking the project further

Many parts of this system can be reused to create other complex Internet-of-Things scenarios. As it currently stands, the web server supports having multiple clients controlling the robot simultaneously. This could be extended to support multiple robots by adding some context switching in the web interface. Another example would be using the same communication flows, but instead of wiring to a robot, wiring to sensors on a washing machine, refrigerator or drier; by changing the web interface this could be turned into a home-automation project. Moreover, the gesture control user interface could be used with good success in a virtual reality platform, such as a car simulation computer game.

6.3 Closing remarks

In general, all the objectives established at the beginning of the project were achieved. The project was a proof of concept of an Internet-of-Things scenario, and as such this report could be useful to anyone wishing to explore having two real-world objects talk to each other over the Internet.


Bibliography

[1] Leap Motion. How does the Leap Motion controller work? http://blog.leapmotion.com/hardware-to-software-how-does-the-leap-motion-controller-work/, 2014.

[2] Leap Motion. Leap coordinate systems. https://developer.leapmotion.com/documentation/csharp/devguide/Leap_Coordinate_Mapping.html, 2015.

[3] Leap Motion. Leap Motion overview. https://developer.leapmotion.com/documentation/javascript/devguide/Leap_Overview.html, 2015.

[4] Lucasbosch (own work). Raspberry Pi 1 Model B+. [CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons, 2014.

[5] www.pololu.com. Dagu Wild Thumper 4WD all-terrain chassis, black, 75:1. https://www.pololu.com/product/1567, 2014.

[6] www.sparkfun.com. Raspberry Pi Camera Module. https://www.sparkfun.com/products/11868, 2014.

[7] www.sparkfun.com. T’Rex robot/motor controller. https://www.sparkfun.com/products/12075, 2014.


Appendix A

Camera setup

A.1 Prerequisites

Assuming we have a Raspberry Pi loaded with the Raspbian operating system, the following packages need to be installed:

mjpg-streamer, libjpeg8-dev, imagemagick

A.2 Starting the stream

Once the prerequisites are installed, the following bash script, which I created, takes care of starting the camera stream (note that the script assumes you have installed the mjpg-streamer package in your home directory):

export LD_LIBRARY_PATH=/home/mjpg-streamer/mjpg-streamer-experimental
/home/mjpg-streamer/mjpg-streamer-experimental/mjpg_streamer \
    -o "output_http.so -w /home/mjpg-streamer/mjpg-streamer-experimental/www" \
    -i "input_raspicam.so -x 640 -y 480 -q 8 -fps 20 -ex night"

This starts a stream on port 8080 with a resolution of 640x480, and limits the quality and the framerate so that the Raspberry Pi can handle the live streaming without a noticeable delay. The stream can now be embedded in any web page.


Appendix B

Communication between the Pi and the T’Rex controller

There are two ways to transfer data between the two devices: one via the I2C bus present on both, and the other via USB. Both interfaces are serial and have their own merits.

B.1 I2C

I2C uses two bidirectional open-drain lines, Serial Data Line (SDA) and Serial Clock Line (SCL), pulled up with resistors. This was the initially chosen method of communication because it is simple to use and employs a master-slave configuration, meaning that with the Raspberry Pi as the master device, several slave devices could be controlled from it, allowing for greater extensibility of the project. One issue faced using this method was that I2C devices can slow down communication by clock-stretching SCL: without going into too much detail, if a device receives data too quickly, it tries to slow down communication. The drivers on the Raspberry Pi did not support clock-stretching, so the Pi and the T’Rex would go out of sync as a result. My solution to this problem was to modify the I2C kernel module on the Raspberry Pi by reducing the baud rate of the bus, thus eliminating the case where a device cannot handle the incoming data quickly enough. This worked only as a temporary solution, however, because one of the pins of the I2C interface on the T’Rex eventually burned out and I was forced to look for another way to make the two boards communicate.

B.2 USB

It turned out that the two devices can also be connected via USB. Communication happens by loading the Standard Firmata sketch on the T’Rex (it is an Arduino-compatible board); several client libraries then allow access to the board’s features. This method has a much faster baud rate than I2C, so it improved the overall performance, but at the cost of limited potential extensibility.
