Final Project Report


Uploaded by shruthi on 15-Dec-2015



Final-year project report for a Bachelor of Engineering in Computer Science at an autonomous college



INDEX

1. Introduction
2. Problem Definition
3. Literature Survey
4. Project Requirement Definition
4.1 Target Users
4.2 User Requirements
4.3 Hardware Requirements
4.4 Software Requirements
4.5 Language Used
5. System Requirement Specification
5.1 Functional Requirements
5.2 Non-functional Requirements
6. Gantt Chart
7. System Design
7.1 Design
8. Detailed Architecture
8.1 Architecture
9. Implementation
10. Integration
11. Testing
11.1 Modules Tested
11.2 Test Strategy
11.3 Test Report Details
11.4 Testing Details of Modules on Device
12. Screenshots
13. Conclusion
14. Future Enhancements

Bibliography


1. Introduction

Need to locate an enemy aircraft, or monitor a wildlife habitat? You can do it through acoustic source location. Acoustic location is the science of using sound to determine the distance and direction of something. This can be done actively or passively.

Passive identification is implemented in this project. Passive acoustic location involves the detection of sound or vibration created by the object being detected, which is then analyzed to determine the location of the object in question. Sound source localization and tracking is widely used in voice communications, security monitoring, anti-terrorism, and the military field.

The object emitting sound is identified using an array of sound sensors, and its coordinates are calculated through time delay estimation. This method can be extrapolated to bigger dimensions and can prove very useful with little capital investment.


2. Problem Definition

GPS (Global Positioning System) is a technology well known to all. Signals from four satellites are used to determine the location of any object.

In case you want to determine the location of an object based on the sound it produces, how would that be done? This is the underlying motivation behind this project. The sound produced by an object needs to be captured by sound sensors, and an appropriate algorithm needs to be used to determine the time delay values and then calculate the coordinates.


3. Literature Survey

This project includes software, hardware, and the interfacing of both these worlds. A study was done on the following terms and concepts:

Five cross element array: An arrangement of sensors to track the sound source. (A five cross element array provides less impedance in the electrical circuit and more consistent results compared to linear arrays.)

Azimuth Angle: It is an angular measurement in a spherical coordinate system. The vector from an observer (origin) to a point of interest is projected perpendicularly onto a reference plane; the angle between the projected vector and a reference vector on the reference plane is called the azimuth.
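As an illustration of the definition above (not part of the original report), the azimuth of a projected point can be computed with atan2; here the +x axis of the reference plane is assumed as the reference vector:

```cpp
#include <cmath>

// Azimuth (degrees) of a point whose perpendicular projection onto the
// reference plane is (x, y), as seen from an observer at the origin.
// The reference vector is taken as the +x axis (an illustrative choice).
double azimuthDeg(double x, double y) {
    const double pi = std::acos(-1.0);
    return std::atan2(y, x) * 180.0 / pi;
}
```

For example, a point projecting to (1, 1) has an azimuth of 45 degrees from the +x axis.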

Time delay estimation: Estimation of the difference in arrival times of the same sound signal at two different sensors.

Active and passive acoustic location: It is the science of using sound to determine the distance and direction of something. Location can be done actively or passively, and can take place in gases (such as the atmosphere), liquids (such as water), and solids (such as the earth).

Active acoustic location involves the creation of sound in order to produce an echo, which is then analyzed to determine the location of the object in question.

Passive acoustic location involves the detection of sound or vibration created by the object being detected, which is then analyzed to determine the location of the object in question.

Cross-correlation: It is a measure of similarity of two waveforms as a function of a time-lag applied to one of them. Cross-correlations are useful for determining the time delay between two signals, e.g. for determining time delays for the propagation of acoustic signals across a microphone array. After calculating the cross-correlation between the two signals, the maximum (or minimum if the signals are negatively correlated) of the cross-correlation function indicates the point in time where the signals are best aligned, i.e. the time delay between the two signals is determined by the argument of the maximum.
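The procedure just described can be sketched in C++ (the report's own delay computation runs on the 8051 instead; the function name and the brute-force lag search here are illustrative assumptions):

```cpp
#include <cstddef>
#include <vector>

// Estimate the delay (in samples) of signal b relative to signal a by
// finding the lag that maximizes their cross-correlation over the range
// -maxLag .. +maxLag. A positive result means b lags a.
int estimateDelay(const std::vector<double>& a,
                  const std::vector<double>& b, int maxLag) {
    int bestLag = 0;
    double bestCorr = -1e300;
    for (int lag = -maxLag; lag <= maxLag; ++lag) {
        double corr = 0.0;
        for (std::size_t i = 0; i < a.size(); ++i) {
            int j = static_cast<int>(i) + lag;
            if (j >= 0 && j < static_cast<int>(b.size()))
                corr += a[i] * b[j];  // overlap of a with b shifted by lag
        }
        if (corr > bestCorr) { bestCorr = corr; bestLag = lag; }
    }
    return bestLag;  // argument of the maximum = estimated delay
}
```

Dividing the returned lag by the sampling rate gives the time delay in seconds.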

MATLAB: MATLAB is a numerical computing environment and an interpreted programming language. MATLAB allows easy matrix manipulation, plotting of functions and data, implementation of algorithms, and creation of user interfaces. The reason for choosing MATLAB as the analysis and simulation tool is that it offers flexible options to support the simulation and makes modification and data recording easy. The simulation is carried out in both simulated and actual noisy environments. The MATLAB tools worked with till now:

TMtool: locating hardware for MATLAB
Data Acquisition Toolbox: acquiring input data for MATLAB

Sensors: Clap switches are used, and their ability to detect a clap is central to the time delay estimation algorithm. These sensors have a sensitivity range of 5 m, which can be extended using amplifiers. More sensitive sensors, such as piezoelectric sensors, are more expensive, and a complete sensor would need to be designed if they were used.

8051 microcontroller: There is no need to program an EPROM separately with the 8051. It is widely available and cost-effective, and it is comfortable to program in 8051 assembly language.

4. Project Requirement Definition

4.1 Target Users:

4.2 User Requirements:

4.3 Hardware Requirements:


i. Clap switches

Used to capture the sound signal. The input required is 12 V; the output given is also 12 V.

ii. 8051 microcontroller

To interface the sensors with the computer. Features:

4K bytes ROM
128 bytes RAM
Four 8-bit I/O ports
Two 16-bit timers
Serial interface
64K external code memory space
64K external data memory space
Boolean processor (operates on single bits)
210 bit-addressable locations
4 μs multiply/divide

iii. Buzzer

A sound device used to produce the sound to be detected.

iv. CP210x UART to USB converter

Used as a connector to interface between the software program and the microcontroller.

Device Descriptor:
bcdUSB: 0x0110
bDeviceClass: 0x00
bDeviceSubClass: 0x00
bDeviceProtocol: 0x00
bMaxPacketSize0: 0x40 (64)
idVendor: 0x10C4
idProduct: 0xEA60
bcdDevice: 0x0100
iManufacturer: 0x01
iProduct: 0x02
iSerialNumber: 0x03
bNumConfigurations: 0x01
ConnectionStatus: DeviceConnected
Current Config Value: 0x01
Device Bus Speed: Full
Device Address: 0x02
Open Pipes: 2

Endpoint Descriptor:
bEndpointAddress: 0x81
Transfer Type: Bulk
wMaxPacketSize: 0x0040 (64)
bInterval: 0x00

Endpoint Descriptor:
bEndpointAddress: 0x01
Transfer Type: Bulk
wMaxPacketSize: 0x0040 (64)
bInterval: 0x00

4.4 Software Requirements:

a. Flash Magic

Flash Magic is an application developed by Embedded Systems Academy to allow you to easily access the features of a microcontroller device. With this program you can erase individual blocks or the entire Flash memory of the microcontroller. Using Flash Magic, you are able to perform different operations on a microcontroller device: erasing, programming, and reading the Flash memory, modifying the Boot Vector, performing a blank check on a section of the Flash memory, and many others. Version 2.4.0 (DLL version 1.90) is used.


b. Visual Studio Express 2012

Microsoft Visual Studio is an integrated development environment (IDE) from Microsoft. It is used to develop console and graphical user interface applications along with Windows Forms or WPF applications, web sites, web applications, and web services, in both native code and managed code, for all platforms supported by Microsoft Windows, Windows Mobile, Windows CE, the .NET Framework, the .NET Compact Framework, and Microsoft Silverlight. Visual Studio supports different programming languages by means of language services, which allow the code editor and debugger to support (to varying degrees) nearly any programming language, provided a language-specific service exists. Built-in languages include C/C++ (via Visual C++), VB.NET (via Visual Basic .NET), C# (via Visual C#), and F# (as of Visual Studio 2010). Support for other languages such as M, Python, and Ruby, among others, is available via language services installed separately. It also supports XML/XSLT, HTML/XHTML, JavaScript, and CSS. Individual language-specific versions of Visual Studio also exist which provide more limited language services to the user: Microsoft Visual Basic, Visual J#, Visual C#, and Visual C++.

c. Gnuplot –

gnuplot is a command-line program that can generate two- and three-dimensional plots of functions, data, and data fits. It is frequently used for publication-quality graphics as well as in education. The program runs on all major computers and operating systems (GNU/Linux, Unix, Microsoft Windows, Mac OS X, and others). It is a program with a fairly long history, dating back to 1986. Despite its name, this software is not distributed under the GNU General Public License (GPL), but under its own more restrictive open-source license.

4.5 Language Used:

The project has three major coding modules.

1. The algorithm to determine the time delay, which is written in 8051 assembly language.

2. The interfacing of the 8051 with the computer to read data sent serially on the COM port. This piece of coding is done in C++ using the Microsoft API documented in the MSDN library.


3. The algorithm to calculate the coordinates once the time delay values are obtained. This is also written in C++.

5. System requirement specification

5.1 Functional Requirements:

All clap switches should trigger within a circle of 4 m diameter.
The sound produced should be in the range of 300 Hz to 600 Hz.
The 8051 should capture and send data successfully on the serial port.
The C++ program should be able to read the data into a file accurately.
The algorithm should run correctly and display the coordinates.

5.2 Non-functional Requirements:

There should not be any significantly loud ambient noise.
The floor where the sensors are placed should not be hollow or produce extreme vibrations.


6. Gantt Chart


7. System Design


Symbols used in the circuit diagram:


RS232 cable connected to the laptop

O (0, 0): origin
L: distance between the origin and the reference sensor (S0)
d: distance between adjacent sensors
S0: reference sensor
S1, S2, S3, S4: the four other sensors

8. Detailed Architecture


Block Diagram

9. Implementation

Block diagram summary: the 8051 microcontrollers, connected to the 10 sensors, send data over RS-232 through an RS232 to USB converter to a laptop running the C software. The C software handles interface communication (reading data from the COM ports and sending 2 sets of 4 time delay values to the calculation part of the software) and the calculation of coordinates, producing the x, y, z coordinates of the target.


1. Time delay estimation algorithm
2. Hardware connections
3. Interfacing the hardware with the software
4. Software coding of the algorithm

1. Time delay estimation algorithm:

Time delay estimation is the estimation of the delay between two sensors capturing the same signal.

The different methods to calculate time delay in MATLAB are cross-correlation (CC), phase transform (PHAT), maximum likelihood estimator (ML), adaptive least mean square filter (LMS) and average square difference function (ASDF).

In this project, a new approach has been tried and the time delay values have been calculated by the 8051 microcontroller.

The above result is achieved by programming in 8051 assembly language.

2. Hardware Connections:

This prototype requires 2 identical modules. Each module contains one 8051 microcontroller, 5 sensors (clap switches), one UART to USB cable, and 2 adaptors. One adaptor powers the 8051 and the other powers the 5 sensors. The 8051 is connected to the computer through a UART to USB cable. The output of the sensors is tapped from the resistor R4 and fed into the 8051 using jumpers through an IC (ULN 2083) which converts 12 V to 5 V.

Photo of the tapped output and circuit diagram of the clap switch

Each sensor's components were soldered on the PCB, and the output wire was connected and soldered as shown in the diagram.

3. Interfacing the hardware with the software:


The UART to USB converter is detected as a COM port on the computer, and to read the incoming data serially, code is written in C++ using the API documented in the MSDN library.

The IDE Visual Studio Express 2012 is used.

Pseudocode:

The COM port number is registered.
CreateFile() is used to open a handle for the specified COM port.
Error checking is done, and the COM state is set and read through SetCommState() and PrintCommState().
Properties like baud rate, byte size, parity, and stop bits are set.
The file to read data into is opened.
A loop to read a given number of characters is run, and the data is read into the file.
This can be done for multiple COM ports sequentially in the same program.

4. Software coding of the algorithm:

There are two formulas given in the paper:

For x, y, z coordinates when the object is in 3D space
For x, y coordinates when the object is in the 2D plane

Since the equation given in the paper was wrong, it was modified through mathematical calculation.

Equation given in the paper for the 2D plane:

X = [ (tan α1 + tan α2) / (tan α2 - tan α1) ] * L
Y = 2 * [ (tan α1 * tan α2) / (tan α1 + tan α2) ] * L

Modified equation for the 2D plane:

X = [ (tan α1 + tan α2) / (tan α2 - tan α1) ] * L
Y = 2 * [ (tan α1 * (-tan α2)) / (tan α1 - tan α2) ] * L

Modified equation for the 3D plane:

sin β1 = (c / (2*d)) * sqrt( (t3 - t1)^2 + (t4 - t2)^2 )
z = ( tan(90° - β1) * y ) / sin α1
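The modified 2D equations translate directly into C++. A minimal sketch (the struct and function names are mine, and angles are assumed to be in radians):

```cpp
#include <cmath>

struct Point2D { double x, y; };

// Modified 2D equations: source coordinates from the two azimuth angles
// alpha1, alpha2 (radians) measured by the two sensor modules, and L,
// the distance between the origin and each reference sensor.
Point2D locate2D(double alpha1, double alpha2, double L) {
    double t1 = std::tan(alpha1), t2 = std::tan(alpha2);
    Point2D p;
    // X = [(tan a1 + tan a2) / (tan a2 - tan a1)] * L
    p.x = ((t1 + t2) / (t2 - t1)) * L;
    // Y = 2 * [(tan a1 * (-tan a2)) / (tan a1 - tan a2)] * L
    p.y = 2.0 * ((t1 * -t2) / (t1 - t2)) * L;
    return p;
}
```

Note that the formulas are undefined when tan α1 = tan α2, i.e. when the two bearing lines are parallel and never intersect.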

After the values are read from the COM port, they are used and integrated with the algorithm coded in C++.

Pseudocode:

The file is opened to extract the time values, in hexadecimal, read in serially from the 8051.
The values are converted to decimal.
The decimal values are multiplied by 1.08 * 10^-6 to convert the number of 8051 machine cycles into time in seconds.


For every module, the time of each sensor is subtracted from that of the reference sensor, and four time delay values are thus calculated. This is done for the other module too.

Values of r01, r02, α1, α2, and β1 are calculated. The x, y, z coordinates are then calculated using the equations above.
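The conversion and subtraction steps can be sketched as follows (the function names and the string-based input format are assumptions; the 1.08 * 10^-6 s per machine cycle factor is from the text):

```cpp
#include <cstddef>
#include <cstdlib>
#include <string>
#include <vector>

// One 8051 machine cycle, as stated in the text.
const double SECONDS_PER_CYCLE = 1.08e-6;

// Convert a hexadecimal cycle count, as read serially from the 8051,
// into a time in seconds.
double hexCyclesToSeconds(const std::string& hex) {
    long cycles = std::strtol(hex.c_str(), nullptr, 16);
    return cycles * SECONDS_PER_CYCLE;
}

// Time delays of one module's four sensors relative to its reference
// sensor S0: delay[i] = t[i] - t0.
std::vector<double> delaysFromReference(double t0, const std::vector<double>& t) {
    std::vector<double> d;
    for (std::size_t i = 0; i < t.size(); ++i) d.push_back(t[i] - t0);
    return d;
}
```

For example, a count of 0x64 (100 cycles) corresponds to 100 * 1.08e-6 s of elapsed time.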


10. Integration

11. Testing


11.1 Modules tested

SOFTWARE

1. Function for multiplying two matrices
2. Function for finding the inverse of a matrix
3. Function to convert decimal numbers to hexadecimal numbers
4. Reading the file properly using file functions
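As an illustration of the kind of helper under test, the matrix multiplication function might look like this (a sketch only; the report does not reproduce the actual project code):

```cpp
#include <cstddef>
#include <vector>

typedef std::vector<std::vector<double> > Matrix;

// Multiply an m x n matrix by an n x p matrix, stored row-major as
// nested vectors. Dimension compatibility is assumed rather than checked.
Matrix multiply(const Matrix& a, const Matrix& b) {
    std::size_t m = a.size(), n = b.size(), p = b[0].size();
    Matrix c(m, std::vector<double>(p, 0.0));
    for (std::size_t i = 0; i < m; ++i)
        for (std::size_t k = 0; k < n; ++k)
            for (std::size_t j = 0; j < p; ++j)
                c[i][j] += a[i][k] * b[k][j];
    return c;
}
```

Unit tests would compare the output against hand-computed products of small matrices.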

INTERFACING

5. Testing the serial connection (8051 sending data serially to the computer)
   Sending a simple letter 'A' and reading it on the HyperTerminal
   Sending the values stored in the registers of the 8051 and reading them on the HyperTerminal
6. Writing to a file when reading from the COM port

HARDWARE

7. Testing the 10 sensors separately for their sensitivity and accuracy
   Tested the range of the sensors by clapping at different positions; the detectable range is around 4 m.
   This works only if the frequency of the sound is in the clap frequency range.
8. Testing both the 8051s separately

11.2 Test Strategy

Unit testing of the modules described above

Component testing : One module = 5 sensors + 1 8051 + 1 UART to USB cable

1. Testing the 8051 with only one sensor

2. Testing the 8051 with 3 sensors

3. Testing the 2 modules separately and checking the results obtained with theoretical values.

Integration testing :

1. Testing both the modules together and checking for consistency of values.


11.3 Test report details

Component testing details:

Theoretical required time delay value: 1.8 ms
Practical values (in ms): 1.82, 1.79, 1.79, 1.81, 5.08, 4.25

Integration testing details:

Serial No. | Theoretical value (x, y) | Practical values (x, y)

With L = 0.75 m and d = 0.5 m

1. (0, 0.5): (0.01, 0.40), (0.18, 0.5)
2. (-0.25, 0.5): (0.08, 0.8), (0.03, 0.77), (-0.25, 0.49), (-0.4, 0.28), (-0.27, 0.41), (-0.21, 0.6)
3. (0.25, 0.5): (0.04, 0.4), (0.31, 0.399), (0.31, 0.4), (0.44, 0.45)
4. (0, -0.5): (0.01, -0.5), (-0.00, -0.49), (0.01, -0.49), (0.6, -0.5), (0.1, -0.5), (-0.18, -0.38), (0.05, -0.5), (0.02, -0.48), (0.06, -0.5)

With L = 1 m and d = 0.5 m

5. (0.5, 0.5): (0.5, 0.42), (0.49, 0.39), (0.48, 0.40), (0.79, 0.09), (0.97, -0.45), (2.14, -1.25), (0.11, 0.41)

12. Screenshots


13. Conclusion

A prototype has been built to identify the location of an object emitting sound. This can prove to be a very helpful design and can be extended to many areas such as home security, military aircraft, wildlife tracking, and conference rooms.


14. Future Enhancements

1. Sensors with high sensitivity can be used to improve the coverage of detection.


2. The analog signal captured by the sensors can be converted to a digital wave and methods like cross correlation can be used to identify the time delay values. This approach is helpful for identifying the paths of moving objects.

3. A camera can be installed in such a way that when the coordinates are determined, it focuses in that direction. This can be used in conference rooms, where the sound captured is that of a speaker and the camera can focus and zoom in on the person speaking.