ECE532: Laser-tracking Remote Control Car
Group Report

Xiang Li, Joseph Lucas, Charles Grandfield


Table of Contents

1. Overview
   1.1 Background
   1.2 Motivation
   1.3 Goals
   1.4 Systems Block Diagram
   1.5 Brief Description of IP
2. Weekly Progress
   2.1 Week One
       2.1.1 Proposal
       2.1.2 Progress
   2.2 Week Two
       2.2.1 Proposal
       2.2.2 Progress
   2.3 Week Three
       2.3.1 Proposal
       2.3.2 Progress
   2.4 Week Four & Five
       2.4.1 Proposal
       2.4.2 Progress
   2.5 Week Six & Seven
       2.5.1 Proposal
       2.5.2 Progress
3. Outcome
   3.1 Achievements
   3.2 Difficulties
   3.3 Possible Improvements
4. Detailed Description of IP Blocks
   4.1 HDMI_out
       4.1.1 Image Filtering
       4.1.2 Image Processing
   4.2 Software
       4.2.1 Geometric Algorithm
       4.2.2 Control Algorithm
5. External Hardware Components
   5.1 Remote Controller
   5.2 Buffer and System Connections
       5.2.1 Use of Buffer
       5.2.2 GPIO Connections
6. Description of Design Tree
7. References


1. Overview

1.1 Background

Interaction between machines and the real-time environment has always been an area of great interest in the field of digital system design. Some of the most commonly seen examples are digital devices used for facial detection, voice detection and smart response (e.g. Google Now, Siri, Figure 1.1.1a), chemical component detection and feedback control (e.g. quality control systems used in factories), light sensing and brightness control (e.g. smartphone screens adapting their brightness to ambient light intensity, Figure 1.1.1b), and ambient physical environment detection (e.g. automatic floor-cleaning robots, Figure 1.1.1c). From these examples, we can see that great efforts have been made in the research and application of digital systems capable of interacting with the real-time environment in an accurate and timely fashion.

Figure 1.1.1 Digital systems capable of responding to the real-time environment (voice, light intensity, and physical obstacles).

To allow a digital system to react actively to its external environment, several components are usually required. While this may not be easy to implement on a single chip, the Xilinx Spartan-6 FPGA board comes with components and ports that allow the user to effectively interface with external systems, such as touch screens, video cameras and HDMI output displays. This allows the board to extract external input, process that input to find particular features in the acquired data, and take action in response to a stimulus from the external environment.

1.2 Motivation


Inspired by the examples of digital systems described above, and having seen several projects made by students who had previously taken this course, we decided to aim our project at graphics processing and timely response to ambient signal input.

One interesting idea is to make a device that possesses tracking capabilities. The mechanism for such a digital system is demonstrated in the following diagram:

Figure 1.2.1 Mechanism for a system with tracking ability

1.3 Goals

The goal of our design is to realise a small remote control car that is capable of tracking a light spot in front of it. The signal input is the light spot shed by a laser pointer on the floor, and we use a camera as the interface between the system and the external environment. The images captured by the camera go through an image filter implemented inside the system, which extracts the necessary information from the images, namely the pixels occupied by the red dot and by the green remote control car. We then use this information to calculate the positions of the dot and the car relative to the camera. Once the coordinates of the dot and the car are known, we can make decisions based on these coordinates and tell the car which direction to go in order to keep tracking the light spot. The decision is translated into digital signals and transmitted to a remote controller, through which we indirectly control the movement of the car.


Therefore, our goal is to complete the loop below:

Figure 1.3.1 Feedback loop for remote car control.


1.4 Systems Block Diagram

1.5 Brief Description of IP

The image below shows all the hardware IP used in our project. The IP elements not included in XPS are vmodcam_0, from the provided IP Project 1 [1], and hdmi_out_0, based on the provided IP Project 3 [2] but modified with our custom IP.


(Figure: details of the hdmi_out block.)

Additionally, we used custom software on the MicroBlaze to control the car and the monitor output. This software uses the AXI bus to program addresses for communication between the vmodcam and hdmi_out, output settings for the hdmi_out, and threshold settings for the filters added to the hdmi_out. It also reads results from the location_sum modules in the hdmi_out IP and uses these values to locate the laser on the display. See Section 4 for further details; a condensed sketch of this flow is given below.
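The sketch below (not the actual code in car_project.c) shows the shape of this software loop, assuming the indirect register interface described in Section 4.1. The base address, the word-offset mapping of the register indices, and all names are ours, not values from the real design.

#include "xil_io.h"

#define HDMI_OUT_BASE 0x7A400000u            /* hypothetical AXI base address */

/* Select a secondary register via slave register 0x0, then read it
   back through slave register 0x1 (assumed to sit at word offset 4). */
static u32 read_secondary(u32 index)
{
    Xil_Out32(HDMI_OUT_BASE + 0x0, index);
    return Xil_In32(HDMI_OUT_BASE + 0x4);
}

int main(void)
{
    for (;;) {
        u32 sum_x = read_secondary(0x3);     /* threshold-filter X total */
        u32 sum_y = read_secondary(0x5);     /* threshold-filter Y total */
        u32 count = read_secondary(0x7);     /* number of passing pixels */

        if (count > 0) {
            u32 laser_x = sum_x / count;     /* mean on-screen position */
            u32 laser_y = sum_y / count;
            (void)laser_x; (void)laser_y;    /* car commands: Section 4.2 */
        }
    }
}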


2. Weekly Progress

In this section, we list the milestones from our proposal and compare them to the progress we achieved.

2.1 Week One

2.1.1 Proposal
Physical interface between car and Xilinx board constructed (not necessarily working fully). First prototypes of image filtering and processing algorithms in software (e.g. a Python program that examines bitmaps).

2.1.2 Progress

While our proposal envisioned us already having our RC car, we spent most of this week selecting the device to use. We first did some research into what type of remote control car we wanted. In order to avoid involving too many analog components in our design, and to have the remote controller easily connected to our FPGA board, we decided to use a remote controller with only four buttons: forward, backward, right and left.

The first car we found on the internet was a remote control car with a USB connection to a computer, allowing the car to be controlled via software (Figure 2.1.1a). This model offered some advantages, given that there are USB ports on the board that we could use to send information directly to the controller. However, the challenge with this option was decoding the signal sent via the USB port and regenerating the same signal from the board.

Figure 2.1.1 Remote control car models

The second model was much simpler. The controller consists of only four digital switches, each capable of giving a signal of 0 or 1. If we could pinpoint the points in the controller where the signals are given, we could solder wires to those points and connect them to the FPGA.


We chose the second model in the end. We then began working on the physical interface, which put us a week behind schedule.

2.2 Week Two

2.2.1 Proposal
Get the interface between the RC car and the Xilinx board working. The demo would have the car driving around, controlled by FPGA switches.

2.2.2 Progress
During the second week we began investigating filtering methods by implementing a Python program to process image files. We also had the controller soldered to a buffer, which set up the physical interface between the FPGA and the controller. A more detailed explanation of the setup of the buffer and the controller can be found in Section 5 of this report.

2.3 Week Three

2.3.1 Proposal
Enable car and laser location tracking using the camera. The FPGA should locate the car and laser in the image and print their location and heading to the console. Depending on the difficulty of implementation, the Video-Out IP may be included in our project to print filtered images and increase our debugging capability.

2.3.2 Progress
During week three we demonstrated the car being controlled by the FPGA, a week after we were scheduled to do so. We did not yet have any filtering being done.

2.4 Week Four & Five

2.4.1 Proposal
Enable primitive interactive control of the car using the laser pointer; begin development and implementation of the car control algorithm for the MicroBlaze (likely to take less than one week).

2.4.2 Progress
By week five we had our filters implemented in software, capable of effectively detecting the car and laser pointer as determined by trials conducted on photographs. Laser-pointer-based control of the car had not yet been implemented, and implementation of the FPGA image processing had not begun.

2.5 Week Six & Seven

2.5.1 Proposal
Enable more nuanced control of the car (e.g. different speeds depending on the position of the laser). Use should feel intuitive.


2.5.2 Progress
By week six, image filtering was implemented in the FPGA and could be demonstrated using the video out over HDMI. The image processing block (accumulating the total x and y values of pixels passing the filter) was prototyped but not tested. The MicroBlaze was not at this time able to acquire data based on our image filtering.

In week seven the image processing block was completed, and the interface between the MicroBlaze and our own HDL logic was implemented. Software was written to control the RC car based upon data acquired from our filtering and processing logic.


3. Outcome

3.1 Achievements

We did not meet all of our original goals. What we did achieve was the following:

- Image input and output.
- Filtering of the image to show the car and laser pointer, with variable thresholds.
- Detection of the laser and its location on screen, though not entirely reliably: the data passed to the MicroBlaze sometimes goes to 0 for extended periods of time (seconds) before returning to the correct values.
- Passing of data from the filter IP to the MicroBlaze.
- Control of the car by software.

3.2 Difficulties

All the serious difficulties encountered and surpassed in this project related to the hdmi_out IP and misunderstandings of its operation and signals. The first, which delayed work for several days, caused our location_sum modules to output 0s on all of their outputs. We eventually determined that this was related to a difference in the hdmi_out between the resolution we had used in our IP Project and the resolution we were using in this project: namely, the hsync and vsync signals had different polarities depending on the resolution, unlike every other signal in the IP.

The other serious difficulties were also related to the hsync and vsync signals, as well as the ve (video enable) signal. One was the assumption, mistakenly embedded in our design, that the hsync and vsync signals would be active (low, given the inverted polarity) at some point while ve was high. This mistake caused excessively high pixel locations, far off camera, as the registers meant to track these locations simply continued accumulating until they overflowed. We also failed to realise initially that the hsync and vsync signals remained high for multiple cycles; this caused our output registers to update with the reset values one cycle after loading in the real values.

The outstanding difficulties relate to the calculation of the on-screen locations of filtered objects and, possibly, to the passing of this data to the MicroBlaze. The unreliability described in Section 3.1 appears only for the red filter (identifying the laser), while the green filter (detecting the car) fails to ever correctly locate the car on the screen, often identifying it as being at coordinates (4000, 4000) on the 640x480 screen, but never shows the extended periods of (0, 0) outputs that the laser does. The asymmetry of the problems, despite the symmetry of the malfunctioning hardware blocks, is odd and remains unexplained.


3.3 Possible Improvements

The obvious improvement is to finish our original proposal. This would entail identifying and fixing whichever problems are preventing correct detection of the car's location and are zeroing the laser coordinates. Once this is completed, our software for getting the car to follow the laser could be tested; it is unknown whether this would work, but it would provide an obvious direction for future work.


4. Detailed Description of IP Blocks

4.1 HDMI_out

As the base for the majority of our hardware, we used the IP Project we had previously worked on and implemented our filters and image processing inside it. This decision was made primarily so that we could access the pixel data prior to output, along with the accompanying signals such as ve (video enable), hsync, and vsync, without having to transfer them to a separate pcore. In order to access the information output by our image processing IP, we modified the hdmi_out slave register interface to access a larger number of registers without having to modify the IP project's AXI connections. The revised register interface is as follows:

Address 0x0: Address for secondary registers (selects which secondary
register is accessed through address 0x1)

  Secondary 0x0: Threshold filter max values
                 Bits 31-24: maximum red; bits 23-16: maximum green;
                 bits 15-8: maximum blue
  Secondary 0x1: Threshold filter min values
                 Bits 31-24: minimum red; bits 23-16: minimum green;
                 bits 15-8: minimum blue
  Secondary 0x2: Difference filter value
                 Bits 7-0: minimum difference between green and the
                 other colours
  Secondary 0x3: Total X coordinates of pixels which passed the
                 threshold filter, first camera
  Secondary 0x4: Total X coordinates of pixels which passed the
                 threshold filter, second camera
  Secondary 0x5: Total Y coordinates of pixels which passed the
                 threshold filter, first camera
  Secondary 0x6: Total Y coordinates of pixels which passed the
                 threshold filter, second camera
  Secondary 0x7: Total number of pixels which passed the threshold
                 filter, first camera
  Secondary 0x8: Total number of pixels which passed the threshold
                 filter, second camera
  Secondary 0x9: Total X coordinates of pixels which passed the
                 difference filter, first camera
  Secondary 0xA: Total X coordinates of pixels which passed the
                 difference filter, second camera
  Secondary 0xB: Total Y coordinates of pixels which passed the
                 difference filter, first camera
  Secondary 0xC: Total Y coordinates of pixels which passed the
                 difference filter, second camera
  Secondary 0xD: Total number of pixels which passed the difference
                 filter, first camera
  Secondary 0xE: Total number of pixels which passed the difference
                 filter, second camera
  Secondary 0xF: Base address of frame for output

Address 0x1: Secondary register data access

Address 0x2: Bit 0: active-high output enable; bit 29: selects red
filter; bit 30: selects green filter
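As an illustration of this interface, a minimal sketch of programming the filters from the MicroBlaze follows. It assumes each register index above maps to a 32-bit word offset (index times 4) on the AXI bus; the base address, the helper names, and the example limits are ours, not values from car_project.c.

#include "xil_io.h"

#define HDMI_OUT_BASE 0x7A400000u          /* hypothetical AXI base address */

/* Pack red/green/blue limits into bits 31-24, 23-16 and 15-8,
   matching the threshold register layout above. */
#define PACK_RGB(r, g, b) (((u32)(r) << 24) | ((u32)(g) << 16) | ((u32)(b) << 8))

/* Write a secondary register: select it via register 0x0, then write
   the value through register 0x1. */
static void write_secondary(u32 index, u32 value)
{
    Xil_Out32(HDMI_OUT_BASE + 0x0, index);
    Xil_Out32(HDMI_OUT_BASE + 0x4, value);
}

void configure_filters(void)
{
    write_secondary(0x0, PACK_RGB(0xFF, 0x40, 0x40)); /* threshold maxima */
    write_secondary(0x1, PACK_RGB(0x80, 0x00, 0x00)); /* threshold minima */
    write_secondary(0x2, 0x30);                       /* green margin     */

    /* Register 0x2: enable output (bit 0) and display the red filter
       (bit 29) on the HDMI monitor. */
    Xil_Out32(HDMI_OUT_BASE + 0x8, (1u << 0) | (1u << 29));
}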

4.1.1 Image Filtering

Our image filtering redirects the data read from DDR memory to multiple filters used to isolate the car and the laser on screen. To detect the (green) car, we used a difference filter which compared the green element of the RGB data to the red and blue elements; a pixel passed the filter if green was larger than both other elements by an amount set by software. The laser was detected by a series of threshold filters which ensured that each element was between a maximum and a minimum value set by software. Typically, the maximum red was 0xFF and the minimum green and blue were 0x0; the filter would detect the ring of red immediately around the centre of the laser, which showed as white and as such was too green and blue to pass the filter. As this ring was circular, it worked for detecting the centre of the laser, since the pixels in the ring average out to the centre. A software reference model of both filters is sketched below.
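In the spirit of our early software prototype, the two filters can be modelled per pixel as follows. This is a C reference model; the struct layout and margin parameter are illustrative, not our hardware's port names.

#include <stdbool.h>
#include <stdint.h>

/* One RGB pixel, 8 bits per channel. */
typedef struct { uint8_t r, g, b; } pixel_t;

/* Difference filter (car): pass if green exceeds both red and blue
   by a software-set margin. */
static bool pass_difference(pixel_t p, uint8_t margin)
{
    return (p.g > p.r + margin) && (p.g > p.b + margin);
}

/* Threshold filter (laser): pass if each channel lies between its
   software-set minimum and maximum. */
static bool pass_threshold(pixel_t p, pixel_t min, pixel_t max)
{
    return p.r >= min.r && p.r <= max.r &&
           p.g >= min.g && p.g <= max.g &&
           p.b >= min.b && p.b <= max.b;
}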

4.1.2 Image Processing

After the filtering, the pass/fail signals from the filters (that is, the output of the difference filter and the ANDed outputs of the threshold filters) are used to find the totals of the X and Y values of all filter-passing pixels. This data is later used in software to determine the mean on-screen locations of the passing pixels. The processing in hardware is done by the location_sum modules. An x counter increments on each pixel and resets on hsync, while a y counter increments on hsync and resets on vsync. These counters are accumulated into registers whenever the pixel in question passes the filter. The module also records the number of pixels which pass the filter. Later, by dividing the accumulated X and Y totals by the total number of passing pixels, software can determine the average location of the filtered object(s). This behaviour can be modelled as in the sketch below.
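A software model of that accumulation, with a frame-buffer walk standing in for the pixel clock, hsync and vsync events (not a cycle-accurate model of location_sum; the names are ours):

#include <stdbool.h>
#include <stdint.h>

typedef struct { uint32_t sum_x, sum_y, count; } location_sum_t;

/* Accumulate the coordinates of passing pixels, as location_sum does
   in hardware: x advances with each pixel and resets at the end of a
   line (hsync); y advances per line and resets per frame (vsync). */
static location_sum_t accumulate(const bool *pass, int width, int height)
{
    location_sum_t s = {0, 0, 0};
    for (int y = 0; y < height; y++)        /* vsync resets y */
        for (int x = 0; x < width; x++)     /* hsync resets x */
            if (pass[y * width + x]) {
                s.sum_x += (uint32_t)x;
                s.sum_y += (uint32_t)y;
                s.count++;
            }
    return s;
}

/* Mean location: (s.sum_x / s.count, s.sum_y / s.count), count > 0. */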

4.2 Software

Our software can be divided into two parts: the geometric algorithm and the control algorithm.

The geometric algorithm serves to interpret the data received from the image filter within the HDMI_out IP block and translate the values into the coordinates of the centres of the car and the laser, which are used by the control algorithm to make decisions about the movement of the car.

The control algorithm serves to control the movement of the car based on the coordinates of the car and the laser. Its main tasks are determining the heading of the car and the direction in which the car should move so as to realise the tracking ability.

4.2.1 Geometric Algorithm

For our demonstration, our software found the pixel corresponding to the centre of mass of the laser and the car by dividing the accumulated total X and Y values of pixels passing the filter by the number of pixels passing the filter. The Y value of this average was then compared against two thresholds, and the car was made to go forwards or backwards if the value was high or low enough.

Our original implementation for processing the image data had additional steps, which were discarded due to the low reliability of the data acquired by the MicroBlaze. First, the vertical and horizontal angles from each camera's centre line to the centre-of-mass pixel average would be calculated: the focal distance was found by experimentally measuring the camera's field of view, and the angle obtained by taking the arctangent of the average x position over this focal distance. From the two horizontal angles, the distance of the object was triangulated to give the object's -Z coordinate, which would be used in generating the X and Y positions of the object. A sketch of this discarded step follows.
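A sketch of that discarded triangulation, under our assumptions: x positions are measured in pixels from each camera's centre line, f is the focal distance in pixels derived from the measured field of view, and the two cameras have parallel optical axes a baseline b apart. The function names are illustrative.

#include <math.h>

/* Horizontal angle from a camera's centre line to a pixel column. */
static double pixel_angle(double x, double f)
{
    return atan(x / f);
}

/* Depth (our -Z coordinate) by triangulation from the two horizontal
   angles. Since tan(atan(x/f)) = x/f, this reduces to the standard
   stereo formula depth = b * f / (x_left - x_right). */
static double triangulate_depth(double x_left, double x_right,
                                double f, double b)
{
    double disparity = tan(pixel_angle(x_left, f))
                     - tan(pixel_angle(x_right, f));
    return b / disparity;   /* caller must ensure disparity != 0 */
}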

4.2.2 Control Algorithm

After the geometric algorithm has given us the coordinates of both the car and the laser with respect to the two cameras, we can implement the control algorithm that keeps the car within a certain distance of the laser.

The overall control algorithm of the car is demonstrated in Figure 4.2.1.

We first decide whether the car should move or remain still by calculating the distance between the laser and the car. If they are too close together, the car should remain still.


If the laser is too far away from the car, then the car should move. However, the position of the car does not give us information about its current heading. To obtain the heading, we either compare the current coordinates with previously saved coordinates, or make the car move forward for a short distance and determine the heading from the change in position.

Once the heading is determined, we can find the angle between the car's heading and the vector that points from the car to the laser. The dot product of these two vectors divided by the product of their lengths gives the cosine of the angle (the law of cosines in vector form), while the sign of the cross product of the two vectors indicates whether the laser is to the left or the right of the car's current heading.

Once the above information is obtained, the car moves towards the laser according to the decision-making flow in the flowchart, and the software constantly updates the coordinates to achieve accurate control. A sketch of this decision step is shown below.
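A sketch of the decision step with 2-D vectors, where h is the car's heading and v points from the car to the laser. The alignment threshold and names are illustrative, not the values used in our software.

#include <math.h>

typedef struct { double x, y; } vec2;

/* cos(angle) between heading and car-to-laser vector:
   dot(h, v) / (|h| |v|), the law of cosines in vector form. */
static double heading_cosine(vec2 h, vec2 v)
{
    double dot = h.x * v.x + h.y * v.y;
    return dot / (hypot(h.x, h.y) * hypot(v.x, v.y));
}

/* 2-D cross product: positive if the laser is to the left of the
   heading in a conventional x-y plane with y up (the sign flips for
   screen coordinates with y down). */
static double heading_cross(vec2 h, vec2 v)
{
    return h.x * v.y - h.y * v.x;
}

typedef enum { FORWARD, LEFT, RIGHT } command_t;

/* Example decision: drive forward when roughly aligned, else turn. */
static command_t decide(vec2 h, vec2 v)
{
    if (heading_cosine(h, v) > 0.9)       /* within roughly 25 degrees */
        return FORWARD;
    return (heading_cross(h, v) > 0.0) ? LEFT : RIGHT;
}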

Figure 4.2.1 Flow chart for the control algorithm


5. External Hardware Components

5.1 Remote Controller

The logic circuit of the remote controller is demonstrated in Figure 5.1.1. As shown, the remote controller consists of four digital switches, each of which is connected to the ground of the system. A signal of 0 is given to the circuit if the switch is pressed; otherwise a signal of 1 is given. A signal of 0 causes the car to go forward, go backward, turn right or turn left, depending on which switch is pressed.

By identifying the points where the four switches connect to the controller circuit and attaching wires to them, we can make the controller ready for connection to our logic circuit.

Figure 5.1.1 Remote controller logic circuit

5.2 Buffer and System Connections

5.2.1 Use of Buffer

When connecting the controller to the FPGA, we could not be certain whether the FPGA could supply enough power to drive the logic circuit. Therefore, in order to ensure enough drive for the circuit, and also to protect the GPIO ports on the FPGA, we decided to place a buffer between the controller and the FPGA (as shown in Figure 5.2.1).

5.2.2 GPIO Connections

The inputs of the buffers are connected to the FPGA via the Pmod connections (as shown in Figure 5.2.2). We can control the inputs to the controller through four bits generated by the axi_gpio instance named switches_gpio_0 that was previously defined in the hardware. For example, writing 0xF (1111 in binary) to the last four bits brings the car to a full stop, while writing 0xE (1110 in binary) causes the car to go forward. A sketch of issuing such a command from software follows.
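This is a minimal sketch using the Xilinx XGpio driver; the device ID and channel are assumptions about our configuration of switches_gpio_0, and only the two command patterns stated above are taken from the text.

#include "xgpio.h"

#define GPIO_DEVICE_ID 0   /* assumed device ID of switches_gpio_0 */
#define GPIO_CHANNEL   1

/* Active-low commands on the low four bits (0 = button pressed). */
#define CMD_STOP    0xF    /* 1111: no button pressed         */
#define CMD_FORWARD 0xE    /* 1110: forward switch pulled low */

int drive_forward(void)
{
    XGpio gpio;
    if (XGpio_Initialize(&gpio, GPIO_DEVICE_ID) != XST_SUCCESS)
        return -1;
    XGpio_SetDataDirection(&gpio, GPIO_CHANNEL, 0x0);  /* all outputs */
    XGpio_DiscreteWrite(&gpio, GPIO_CHANNEL, CMD_FORWARD);
    return 0;
}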

The section of the code in the UCF file that defines the connection of the Pmod GPIO connectors is also shown below:

NET "Pmod_O[0]" LOC = "T3" | IOSTANDARD = "LVCMOS33"; NET "Pmod_O[1]" LOC = "R3" | IOSTANDARD = "LVCMOS33"; NET "Pmod_O[2]" LOC = "P6" | IOSTANDARD = "LVCMOS33"; NET "Pmod_O[3]" LOC = "N5" | IOSTANDARD = "LVCMOS33"; NET "Pmod_O[4]" LOC = "V9" | IOSTANDARD = "LVCMOS33"; NET "Pmod_O[5]" LOC = "T9" | IOSTANDARD = "LVCMOS33"; NET "Pmod_O[6]" LOC = "V4" | IOSTANDARD = "LVCMOS33"; NET "Pmod_O[7]" LOC = "T4" | IOSTANDARD = "LVCMOS33";

This ensures the corresponding ports are correctly connected to the AXI bus interface and can be accessed by the MicroBlaze.

The entire physical interface between the FPGA and the remote controller is shown in Figure 5.2.3. Note that the controller, the FPGA and the buffer IC must share a common ground.

Figure 5.2.2 Pmod connector with buffers


Figure 5.2.3 Sketch of the entire physical connection between the FPGA and the remote controller


6. Description of Design Tree

Key items are listed below; directories are marked with a trailing slash.

pcores/
    hdmi_out_v1_00_a/ - Video output project from IP3 [2], as modified by our group.
        hdl/
            verilog/
                user_logic.v - All image filtering and processing is implemented in this file.
    vmodcam_v1_00_a/ - Video input project from IP1 [1].
workspace/
    car_project_0/
        src/
            car_project.c - Final software used for demonstration. Moves the car forwards or backwards depending on the location of the laser on screen. Also includes the geometric algorithm.
doc/
    Final_Report.pdf
