Demo Code Review: Wildlife@Home. Travis Desell, Department of Computer Science, University of North Dakota; Susan Ellis-Felege, Department of Biology, University of North Dakota. February 19, 2014, Grand Forks, ND.


Page 1:

Demo Code Review
Wildlife@Home

Travis Desell
Department of Computer Science
University of North Dakota

February 19, 2014
Grand Forks, ND

Susan Ellis-Felege
Department of Biology
University of North Dakota

Page 2:

Code Review Requirements

Page 3:

Code Review Requirements

You should have at least the following 5 sections in your presentation:

1. Problem definition
2. Demo
3. Software Overview Diagram and Explanation
4. Code Presentation & Database Schema (if applicable)
5. Progress & Future Work

Also:

• Code must be in GitHub
1. Code must be well commented
2. At least 3 TODOs in comments per team member
3. Specify what code/files have been written by which team member

Page 4:

Overview

1. Problem Definition
2. Demo
3. Software Overview Diagram and Explanation
4. Code Presentation
5. Progress & Future Work

Page 5:

Problem Definition

Page 6:

During the summers of 2012 and 2013, Dr. Ellis-Felege gathered 50,000 hours of avian nesting video from the following species:

1. Sharp-tailed grouse (Tympanuchus phasianellus), an important game bird and wildlife health indicator species (~35.5k hours).

2. Piping plovers (Charadrius melodus), a federally listed threatened species (~3.3k hours).

3. Interior least terns (Sternula antillarum), a federally listed endangered species (~10.6k hours).

At least 60,000 more hours are expected by the end of the project.

Page 7:

All three species are ground nesting birds.

Sharp-tailed grouse nest in dense grass (top left). Nests were monitored in areas of high oil development, moderate oil development, and no oil development (protected state land).

Piping plovers and interior least terns are shore nesting species (top right). Nests were monitored along the Missouri River in North Dakota.

Photos: Sharp-tailed Grouse (top left); Piping Plover (top right).

Page 8:

What’s the point?

1. Current cameras that use automated motion detection miss some predators and are not robust enough.

2. Camera footage allows Dr. Ellis-Felege to manage and evaluate studies with large enough sample sizes for statistical significance.

3. Answer biological questions about parental investment and predator-prey interactions for these ground nesting species.

4. Examine the effect of oil development on wildlife in western North Dakota, which is experiencing a boom in fracking.

Page 9:

Demo: Crowd Sourcing Interface

Page 10:

http://volunteer.cs.und.edu/wildlife/

Wildlife@Home

Page 11:

Volunteers enter observations via a webpage which streams video, marking yes/no/unsure for each type of observation.

A validator awards users credit and updates their accuracy when a quorum is reached for each observation type.

Page 12:

Software Diagram and Explanation

Page 13:

1. Video is brought back from the field (where there is no or limited Internet access) and uploaded to the servers.

Page 14:

2. A daemon discovers newly uploaded videos and adds them to the video database.
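
As a rough illustration of this step, the sketch below polls an upload directory and registers any new files in a video table. The directory path, table name, columns, and database credentials are placeholders rather than the project's actual configuration, and the real Wildlife@Home daemon may be structured quite differently.

// Minimal sketch of a video-discovery daemon, assuming a hypothetical
// /data/incoming_videos upload directory and a "video" table with
// (filename, processed) columns; not the actual Wildlife@Home daemon.
#include <dirent.h>
#include <mysql/mysql.h>
#include <cstdio>
#include <set>
#include <string>
#include <unistd.h>

int main() {
    MYSQL *conn = mysql_init(NULL);
    if (!mysql_real_connect(conn, "localhost", "wildlife", "password",
                            "wildlife", 0, NULL, 0)) {
        fprintf(stderr, "connect failed: %s\n", mysql_error(conn));
        return 1;
    }

    // Filenames already registered; a real daemon would check the database
    // instead so that restarts do not re-insert existing videos.
    std::set<std::string> known;

    while (true) {
        DIR *dir = opendir("/data/incoming_videos");
        if (dir != NULL) {
            struct dirent *entry;
            while ((entry = readdir(dir)) != NULL) {
                std::string name(entry->d_name);
                if (name == "." || name == ".." || known.count(name)) continue;

                // Register the newly uploaded video in the video database.
                char query[1024];
                snprintf(query, sizeof(query),
                         "INSERT INTO video (filename, processed) VALUES ('%s', 0)",
                         name.c_str());
                if (mysql_query(conn, query) == 0) known.insert(name);
                else fprintf(stderr, "insert failed: %s\n", mysql_error(conn));
            }
            closedir(dir);
        }
        sleep(60);  // poll once a minute for new uploads
    }
}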

Page 15:

3. Another daemon converts the video into formats for web streaming (generating 3, 5, 10, and 20 minute segments) and for analysis by volunteered computers.
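
The slides do not say how the conversion is done; the sketch below assumes ffmpeg is used and shows how fixed-length, web-streamable MP4 segments could be produced. The codecs, flags, and file names here are illustrative assumptions, not the project's actual encoding settings.

// Minimal sketch of the conversion step, assuming ffmpeg; the real daemon's
// tools, codecs, and output formats may differ.
#include <cstdio>
#include <cstdlib>
#include <string>

// Split 'input' into fixed-length MP4 segments named <prefix>_000.mp4,
// <prefix>_001.mp4, and so on.
int segment_video(const std::string &input, const std::string &prefix,
                  int segment_seconds) {
    char command[2048];
    snprintf(command, sizeof(command),
             "ffmpeg -i %s -c:v libx264 -c:a aac "
             "-f segment -segment_time %d -reset_timestamps 1 %s_%%03d.mp4",
             input.c_str(), segment_seconds, prefix.c_str());
    return system(command);
}

int main() {
    // Generate the 3, 5, 10, and 20 minute versions mentioned on the slide.
    const int lengths[] = {180, 300, 600, 1200};
    for (int seconds : lengths) {
        char prefix[64];
        snprintf(prefix, sizeof(prefix), "nest_video_%dmin", seconds / 60);
        if (segment_video("nest_video.mkv", prefix, seconds) != 0) {
            fprintf(stderr, "segmentation failed for %d second segments\n", seconds);
            return 1;
        }
    }
    return 0;
}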

Page 16:

4. Webpages determine what videos to display to users and allow them to record their observations.

Page 17:

5. BOINC is used to send workunits out and collect results of the computer vision and motion detection techniques.
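
A work generator for this step might look roughly like the sketch below. The create_boinc_workunit() helper, the workunit_generated column, and the table layout are hypothetical; the real daemon would go through BOINC's own work-creation API and workunit/result templates, which these slides do not cover.

// Rough sketch of a work generator; create_boinc_workunit() and the video
// table columns are hypothetical placeholders, not the project's real code.
#include <mysql/mysql.h>
#include <cstdio>
#include <string>
#include <unistd.h>

// Hypothetical helper: in the real project this would invoke BOINC's
// work-creation machinery (templates, input file staging, etc.).
bool create_boinc_workunit(const std::string &video_filename) {
    printf("would create a workunit for %s\n", video_filename.c_str());
    return true;
}

// Queue work for video segments that have not yet been sent out for analysis.
int generate_work(MYSQL *conn) {
    if (mysql_query(conn,
            "SELECT id, filename FROM video WHERE workunit_generated = 0 LIMIT 100"))
        return -1;

    MYSQL_RES *result = mysql_store_result(conn);
    MYSQL_ROW row;
    int generated = 0;
    while ((row = mysql_fetch_row(result)) != NULL) {
        if (create_boinc_workunit(row[1])) {
            char query[256];
            snprintf(query, sizeof(query),
                     "UPDATE video SET workunit_generated = 1 WHERE id = %s", row[0]);
            mysql_query(conn, query);
            generated++;
        }
    }
    mysql_free_result(result);
    return generated;
}

int main() {
    MYSQL *conn = mysql_init(NULL);
    if (!mysql_real_connect(conn, "localhost", "wildlife", "password",
                            "wildlife", 0, NULL, 0)) {
        fprintf(stderr, "connect failed: %s\n", mysql_error(conn));
        return 1;
    }
    while (true) {
        int n = generate_work(conn);
        if (n > 0) printf("generated %d workunits\n", n);
        sleep(60);  // poll for unprocessed video segments
    }
}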

Page 18:

6. A scientific web portal allows project scientists to enter their own video observations and compare them with the crowd-sourced results and the automated techniques.
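
One simple comparison the portal could make is how often crowd answers agree with an expert's answer for a given observation type on a video. The sketch below shows that calculation in isolation; the Answer values and the decision to ignore unsure votes are assumptions for illustration, not the portal's actual metric.

// Standalone sketch of expert-vs-crowd agreement for one observation type;
// the enum values and handling of unsure answers are illustrative only.
#include <cstdio>
#include <vector>

enum Answer { NO = 0, YES = 1, UNSURE = 2 };

// Fraction of crowd answers matching the expert's answer (UNSURE ignored).
double agreement(Answer expert, const std::vector<Answer> &crowd) {
    int considered = 0, matched = 0;
    for (Answer a : crowd) {
        if (a == UNSURE) continue;
        considered++;
        if (a == expert) matched++;
    }
    return considered > 0 ? (double) matched / considered : 0.0;
}

int main() {
    std::vector<Answer> crowd = {YES, YES, UNSURE, NO, YES};
    printf("agreement with expert: %.2f\n", agreement(YES, crowd));  // 0.75
    return 0;
}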

Page 19:

Code Presentation

Page 20:

Database Schema

Page 21:

Code Overview

watch.php displays the video and provides the radio buttons for users to select yes/no/unsure for the various observation categories.

watch.js handles all of the elements in watch.php and submits a user's observations via the report_observation.php script.

report_observation.php returns JSON with results from the database, which is used to populate a modal showing the observations from other users.

Finally, crowd_observation_validator.cxx reads the user observations from the database, validates them against other users' observations, and awards credit.
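
The heart of that validation is a quorum check over the yes/no/unsure answers for each observation type. The sketch below is a simplified, standalone version of the idea; the quorum size, credit value, and Observation fields are placeholders rather than the actual logic in crowd_observation_validator.cxx.

// Simplified quorum-based validation for one observation type on one video;
// QUORUM, CREDIT, and the Observation fields are illustrative placeholders.
#include <cstdio>
#include <vector>

enum Answer { NO = 0, YES = 1, UNSURE = 2 };

struct Observation {
    int user_id;
    Answer answer;
};

const int QUORUM = 3;        // matching answers needed before validating
const double CREDIT = 10.0;  // credit granted per validated observation

// Returns true and sets 'consensus' once enough users agree on yes or no.
bool check_quorum(const std::vector<Observation> &observations, Answer &consensus) {
    int yes = 0, no = 0;
    for (const Observation &o : observations) {
        if (o.answer == YES) yes++;
        else if (o.answer == NO) no++;
    }
    if (yes >= QUORUM || no >= QUORUM) {
        consensus = (yes >= no) ? YES : NO;
        return true;
    }
    return false;
}

int main() {
    std::vector<Observation> obs = {{1, YES}, {2, YES}, {3, UNSURE}, {4, YES}};
    Answer consensus;
    if (check_quorum(obs, consensus)) {
        // Users whose answer matches the consensus would be granted credit
        // and have their accuracy statistics updated in the database.
        for (const Observation &o : obs) {
            if (o.answer == consensus)
                printf("user %d matches consensus, award %.1f credit\n",
                       o.user_id, CREDIT);
        }
    }
    return 0;
}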

Page 24:

report_observation.php

https://github.com/travisdesell/wildlife_at_home/blob/master/webpage/report_observation.php

Page 25:

crowd_observation_validator.cxx

https://github.com/travisdesell/wildlife_at_home/blob/master/boinc_daemons/crowd_observation_validator.cxx

Page 26:

Progress and Future Work

Page 27:

Recent Progress and Future Work

The project has now had almost 200 users watch video, and many are highly active. Users have watched and validated over 7,000 hours of video, and there are now over 300,000 observed videos.

Future work includes comparing user accuracy using various viewing segment lengths, and developing new interfaces to allow more users to provide more detailed descriptions of what’s happening in the video.