
RUNNING HEAD: LED-GUIDED OBSERVATION CAMERA FOR PETS (LOC-MORP)

LED-GUIDED OBSERVATION CAMERA FOR PETS (LOC-MORP)

Phys 124, Fall 2017
Jordan J. (A12311135) & Morgan M. (A13122231)

Introduction of Project
Our project was inspired by and developed around a problem in observing rodents' whiskers as they react to stimuli. Researchers were unable to record footage detailed enough to observe the whiskers while also arranging the environment they wanted for the observation.

A device that tracks the rodent as it travels around the environment would resolve this, sparing the researcher the tedious task of following the animal throughout the test. This inspired our project: LOC-MORP.1

LOC-MORP is a large device designed to be fitted over an arena so that a small animal test subject can undergo an experiment undisturbed by a clumsy researcher, without sacrificing the ability to track the subject in high detail. The device takes the place of the researcher recording the footage, capturing closer and steadier observations, and lets the researcher use their time more efficiently to produce better results.

Plans of the Project
Most of the initial planning for LOC-MORP went into selecting the type of tracking device and the configuration of the manifold that would hold the high-resolution camera. A major topic of discussion was the cost of the build plan we would settle on; this concern was ultimately relieved by UCSD Professor Dr. Kleinfeld, who endorsed the project through his lab funds.

From this point we went to the drawing board and decided to base the foundational structure and design of the device on a 3-D printer. Like the printer, our device would be a two-dimensional rail system. One key difference is that it would carry and transport a camera over the field, guided by a separate secondary camera mounted off to the side of the device and looking down into the arena. This secondary camera lets us actively track the mouse moving about the arena with a system that is separate from the moving parts of the 2-D device.

We arranged our plans into a project proposal, presented it to Professor Barreiro and staff, and it was approved! After the approval, we ordered parts from "Open Builds", a hobbyist supplier that sells rail systems of the kind used in 3-D printers.

1 In our Demonstration 12.14.2017, our project was introduced and referred to as “Whiskers”.


Process of Construction
The Open Builds rail system parts took longer than expected to ship to UCSD, so actual construction did not begin until a few weeks after the project proposal was approved.

Structural & Material Construction
While waiting for the rail system to arrive, our group worked out how to code for the stepper motors that came with the rail set (these differ from the DC motors listed in the project proposal plan). The camera chosen to track the LED in the arena was the OpenMV, a popular device in our Phys 124 class that could effectively fulfill LOC-MORP's requirements.

The rail system was easy to build because it came with step-by-step video instructions. However, its weaknesses slowly became apparent: the screws in the set would strip after a few uses, the structure was made of brittle acrylic, and the scale of the system made it less stable.

Resolving these issues took most of our effort and forced us to change some of the plans made during the proposal. Throughout construction, the acrylic corners and motor manifolds would crack, snap, and break, requiring constant repairs. The posts that raise the device had to be reinforced to keep it balanced.

Please see the attachment "STRUCTURE" for a diagram.

Electronics & Software Development
The wiring for the two stepper motors was included in the Open Builds kit; all we had to do was connect the respective terminals to the stepper drivers and maintain a communication loop between the OpenMV's sensor detection and calculations and the stepper driver boards. This, however, proved to be one of the more difficult parts of the project.

Fig. 1, Above: Construction of the 2-D Rail System and Structural Support


Rather than purchasing the $200+ stepper driver from the Open Builds site, we decided to try building a working stepper driver ourselves from material scraps found in the lab. We recovered stepper driver boards from 2009, and with advice and guidance from Prof. Barreiro we wired H-bridges in parallel on each board to reach the current requirements of the two stepper motors (they draw 3 A, which exceeds the maximum current a single bridge can deliver).
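
For a rough sense of the numbers, the sketch below is a back-of-the-envelope current budget, assuming the nominal rating of the SN754410 chips named in the code's Stepper class (roughly 1 A of continuous output per bridge). It is a check under that assumption, not a measured result.

# Back-of-the-envelope current budget (assumes ~1 A continuous per SN754410 H-bridge)
I_PER_BRIDGE = 1.0                 # A, nominal continuous rating per bridge (datasheet value, assumed)
BRIDGES_IN_PARALLEL = 2            # bridges wired in parallel per motor
I_REQUIRED = 3.0                   # A, current the stepper motors ask for

I_available = I_PER_BRIDGE * BRIDGES_IN_PARALLEL    # about 2 A
print("available %.1f A vs required %.1f A" % (I_available, I_REQUIRED))   # parallel bridges only approach the requirement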

A combination of two H-bridges wired in parallel and a stepper driver board could power a single stepper motor. These H-bridges would become extremely hot, so after 20-30 seconds of use we had to shut the power supply off to prevent overheating.

We applied thermal paste to the bridges and placed heat sinks on them to further protect the components and the breadboard from heat damage.

Please see the attachment "ELECTRICAL" for a diagram.

After-Action Report of Construction
The construction process in the weeks after acquiring the materials proved to be difficult. The issues mainly stemmed from the interfaces our group was dealing with: the OpenMV camera software, the stepper driver boards, and the code that runs the stepper motors in response to observations from the OpenMV camera.

Even after the setback of snapped acrylic parts on the rail system, our group ran into issues getting the stepper motors to function. As explained above, they require more current than the smaller stepper motors provided in class for testing the code. Although we could easily supply the required voltage range to the boards, we could only approach the required current by wiring the bridges in parallel.

The overall structure of the device is impressive. It holds itself up and does not lose stability when the motors jolt about on their respective belts. The structural beams and supports were a simple addition to the raising posts that keep the device from tipping over.

Fig. 2, Above: Stepper Driver w/ H-bridges in parallel.


Does it Fulfill the Requirements?
In this section we examine whether the device fulfills the requirements set forth at the beginning of the quarter, the requirements that accompany a successful build. We ran into many obstacles during the build, so it was always useful to refer back to these requirements to check that we were on track.

Measure/Sense/Perceive
LOC-MORP uses the OpenMV camera to observe the environment, find the LED, and track the animal it is attached to.

Process/Calculate/Think
The computer uses the information gathered by the camera to calculate the location of the LED within the device's arena field. The LED's position is determined from the location of the point source of light in the pixel frame, and that location is converted into commands for the stepper motors to move the camera to wherever the LED is inside the arena.

Act/React/Do
The stepper motors carry out the commands along their respective degrees of freedom (X and Y), bringing the camera attached to one of the rails directly over the animal wearing the active LED.
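
As an illustration of this measure-process-act chain, the sketch below converts the LED's x position in the camera frame into a step command for the x motor, using the conversion constants from the code in the appendix (IN_PER_PX, STEPS_PER_REV, IN_PER_REV). The helper names are illustrative and are not the exact routines running on the device.

# Illustrative sketch only; constants mirror the appendix code, function names are hypothetical.
IN_PER_PX = 30 / 320       # arena inches seen per image pixel (measured)
STEPS_PER_REV = 200        # 1.8 degree stepper -> 200 steps per revolution
IN_PER_REV = 0.032         # carriage travel per motor revolution (measured)

def led_pixel_to_steps(cx_pixels):
    # Measure/Process: pixel column -> inches across the arena -> absolute motor steps
    inches = cx_pixels * IN_PER_PX
    return round(inches * STEPS_PER_REV / IN_PER_REV)

def steps_to_command(cx_pixels, current_motor_steps):
    # Act: signed number of steps the x motor should move to sit over the LED
    return led_pixel_to_steps(cx_pixels) - current_motor_steps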

Conclusions & Room for Renewal/Growth
If our group were to complete this project again from the beginning, there are plenty of things we would do differently to avoid the burden we experienced near the end of development. Our conclusions and reflections are as follows:

Research is key! Many of the obstacles we faced while developing this device could have been avoided, or at least prepared for, with the right amount of investigative research. Hitting these obstacles was a huge learning experience, but it was frustrating to confront so many. Research is a way to avoid these confrontations so that building, testing, and refining go much more smoothly.

This project will continue to be worked on so that it can be completed and used in Dr. Kleinfeld's lab in due time. The struggles we had with the stepper driver boards will be resolved by acquiring more capable drivers, and the device will then be tested in the environment it is destined to operate in; this is an ongoing development.

This device was one of many applications of a rail system, and our group is proud of the experience gained in planning and building it. We both now understand how important planning is when creating a project, and how much care it takes to work with these materials.


Appendix:
STRUCTURE:


ELECTRICAL:

CODE:

The code is based on example scripts from the OpenMV Cam M7 IDE. In its current version it is not functional as a single script and requires further debugging to make it compatible with the LOC-MORP system. However, many of the individual components of the code function independently of one another and have been tested successfully.

The code overview is as follows:

1. Setup and definitions
2. Creation of a Stepper library (to allow control of the stepper motors through the interface of our specific motor driver; this would need to be completely retooled for any other driver)
3. Camera setup (in this case, tracking the color green for demonstration/testing purposes; the functional version of the code would track a white light)
4. Begin loop:
   1. Initialize variables
   2. Take image from camera
   3. Locate object in image
   4. Convert image location to real-life location in 1-D
   5. Use PID to adjust motor speed and direction (a minimal sketch follows this list)
   6. Apply calculated motor speed and direction to the motor
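
For reference, the PID update in step 5 has the following general shape. This is a minimal sketch with illustrative names and placeholder gains; the device-specific version, with its own gains and clamping, appears in the full script below.

# Minimal PID sketch (illustrative names; see the full script below for the device version)
def pid_update(error, integral, last_error, dt, kp, ki, kd, i_min=0.0, i_max=0.0):
    integral = max(min(integral + error, i_max), i_min)       # clamped integral term
    derivative = (error - last_error) / dt if dt else 0.0     # finite-difference derivative
    output = kp * error + ki * integral + kd * derivative
    return output, integral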

Code is as follows:

# ------ TestMotorController ------
# Testbed for control of x dimension stepper motor,
# given input from Open MV Cam M7

import sensor, image, time, pyb, math
from pyb import Timer

threshold_index = 1
thresholds = [(30, 100, 15, 127, 15, 127),
              (30, 100, -64, -8, -32, 32),
              (0, 30, 0, 64, -128, 0)]

TIMER_ID = 14

steps_P_GAIN = 0.6
steps_I_GAIN = 0.0
steps_I_MIN = -0.0
steps_I_MAX = 0.0
steps_D_GAIN = 0.2

SPEED_P_GAIN = 1.0
SPEED_I_GAIN = 0.0
SPEED_I_MIN = -0.0
SPEED_I_MAX = 0.0
SPEED_D_GAIN = 0.0

MAX_ACCEL = 2000
DEFAULT_SPEED = 3000
BLOB_THRESHOLD = 100
BINARY_VIEW = False  # low FPS if on, but good for debugging
DO_NOTHING = False   # Just capture frames...

STEPS_PER_REV = 200   # given by the specs of the motor: 1.8 deg/phase -> 200 steps/revolution
IN_PER_REV = 0.032    # measured quantity
IN_PER_PX = 30/320    # measured quantity

class Stepper:

    """ Handles SN754410-based hardware driver for bipolar stepper motors """

    # Half-step coil sequence; within this class only its length is used (by record_step)
    steps = [ [1], [1, 2], [2], [2, 3], [3], [3, 4], [4], [1, 4] ]

    def __init__(self, dir_pin, step_pin, enable_pin):
        self.step_pin = pyb.Pin(step_pin, pyb.Pin.OUT_PP)
        self.dir_pin = pyb.Pin(dir_pin, pyb.Pin.OUT_PP)
        self.enable_pin = pyb.Pin(enable_pin, pyb.Pin.OUT_PP)
        self.enable_pin.high()      # start with the driver disabled (enable pin high = off, see set_off)
        self.dir = 0
        self.pulserate = 100
        self.count = 0
        self.speed = 0
        self.MAX_ACCEL = 100        # equivalent to 100 x (periodicity of set_speed) usteps/sec/sec
        self.current_step = 0

    def do_step(self):
        # Called by timer interrupt every 100 us: emit one step pulse
        # every `pulserate` ticks while a direction is set.
        if self.dir == 0:
            return
        self.count = (self.count + 1) % self.pulserate
        if self.count == 0:
            self.step_pin.high()
            self.step_pin.low()
            self.record_step()

    def set_speed(self, speed):
        # Called periodically; ramps toward the requested speed, limited by MAX_ACCEL.
        if (self.speed - speed) > self.MAX_ACCEL:
            self.speed -= self.MAX_ACCEL
        elif (self.speed - speed) < -self.MAX_ACCEL:
            self.speed += self.MAX_ACCEL
        else:
            self.speed = speed
        # set direction
        if self.speed > 0:
            self.dir = 1
            self.dir_pin.high()
            self.enable_pin.low()
        elif self.speed < 0:
            self.dir = -1
            self.dir_pin.low()
            self.enable_pin.low()
        else:
            self.dir = 0
        if abs(self.speed) > 0:
            self.pulserate = 5000 // abs(self.speed)

    def set_off(self):
        self.enable_pin.high()

    def get_speed(self):
        return self.speed

    def get_current_step(self):
        return self.current_step


    def record_step(self):
        if self.speed > 0:
            self.current_step += 1
        if self.speed < 0:
            self.current_step -= 1
        if self.current_step == len(self.steps):
            self.current_step = 0

def do_step(t):
    # Timer interrupt callback: forward each tick to the motor's step routine.
    motor1.do_step()

def calculate_steps(x):
    # Convert the blob's x position (pixels) into a step error relative to the motor.
    motor_location = motor1.get_current_step()
    blob_location = x * IN_PER_PX * STEPS_PER_REV / IN_PER_REV  # dimensional analysis: pixels -> inches -> steps
    diff = blob_location - motor_location
    return diff

sensor.reset()                      # Reset and initialize the sensor.
sensor.set_pixformat(sensor.RGB565) # Set pixel format to RGB565 (or GRAYSCALE)
sensor.set_framesize(sensor.QVGA)   # Set frame size to QVGA (320x240)
sensor.skip_frames(30)              # Wait for settings to take effect.
sensor.set_auto_gain(False)         # Must be turned off for color tracking
sensor.set_auto_whitebal(False)
#sensor.NEGATIVE                    # Turn on negative filter to invert colors
clock = time.clock()                # Create a clock object to track the FPS.
tim = Timer(TIMER_ID, freq=10000)   # 10 kHz timer -> do_step() every 100 us

motor1 = Stepper('P7', 'P8', 'P9')
tim.callback(do_step)               # attach the step routine to the timer interrupt

## ACTION ##

old_time = pyb.millis()

old_speed = None
speed_i_output = 0
speed_output = DEFAULT_SPEED

old_steps = None
steps_i_output = 0
steps_output = 0

while True:
    clock.tick()              # Update the FPS clock.
    img = sensor.snapshot()   # Take a picture and return the image.
    #if BINARY_VIEW: img = img.binary(thresholds)
    #if BINARY_VIEW: img.erode(1, threshold=2)
    if DO_NOTHING:
        continue


    for blob in img.find_blobs([thresholds[threshold_index]],
                               pixels_threshold=200,
                               area_threshold=200,
                               merge=True):

        a = blob.area()
        cx = blob.cx()
        cy = blob.cy()

        print_string = ""
        if a >= BLOB_THRESHOLD:
            img.draw_rectangle(blob.rect())
            new_time = pyb.millis()
            delta_time = new_time - old_time
            old_time = new_time

            #
            # Figure out how many steps to take and run PID
            #
            current_steps = calculate_steps(cx)
            steps_change = (current_steps - old_steps) if (old_steps != None) else 0
            old_steps = current_steps

            steps_p_output = current_steps
            # Standard PID stuff here... nothing particularly interesting :)
            steps_i_output = max(min(steps_i_output + current_steps, steps_I_MAX), steps_I_MIN)
            steps_d_output = ((steps_change * 1000) / delta_time) if delta_time else 0
            steps_pid_output = (steps_P_GAIN * steps_p_output) + \
                               (steps_I_GAIN * steps_i_output) + \
                               (steps_D_GAIN * steps_d_output)
            steps_output = round(steps_pid_output)
            print("FPS %f, steps %d" % (clock.fps(), steps_output))

            #
            # Figure out speed and do speed PID
            #
            current_speed = motor1.get_speed()
            speed_change = (current_speed - old_speed) if (old_speed != None) else 3000
            old_speed = current_speed

            speed_p_output = current_speed
            speed_i_output = max(min(speed_i_output + current_speed, SPEED_I_MAX), SPEED_I_MIN)
            speed_d_output = ((speed_change * 1000) / delta_time) if delta_time else 0
            speed_PID_output = (SPEED_P_GAIN * speed_p_output) + \
                               (SPEED_I_GAIN * speed_i_output) + \
                               (SPEED_D_GAIN * speed_d_output)
            speed_output = max(min(round(speed_PID_output), 100), 0)  # convert raw to a percentage
            print("FPS %f, speed %d" % (clock.fps(), speed_output))

        if speed_output != None:
            print_string = "Tracking Mouse - speed %d, steps %d" % (speed_output, steps_output)
        else:
            print_string = "Mouse Lost - speed %d, steps %d" % (speed_output, steps_output)

        print("FPS %f, %s" % (clock.fps(), print_string))
        n = 0
        while n <= steps_output:
            motor1.set_speed(speed_output)
            n = n + 1

    pyb.delay(20)