
Experiments in Fluids 12, 353-358 (1992)

© Springer-Verlag 1992

Three-dimensional particle imaging with a single camera

C. E. Willert and M. Gharib

Dept. of Applied Mechanics and Engineering Sciences, 0411, University of California, San Diego, La Jolla, CA 92093, USA

Abstract. A new approach to the instantaneous three-dimensional mapping of flow fields is introduced. A single camera system uses defocusing in conjunction with a mask (three pin holes) embedded in the camera lens to decode three-dimensional point sources of light (i.e., illuminated particles) on a single image. The sizes and locations of the particle image patterns on the image plane relate directly to the three-dimensional positions of the individual particles. Using sequential images, particles may be tracked in space and time, yielding whole-field velocity information. Calibration of the system is straightforward, whereas the self-similarity of the particle image patterns can be used in automating the data-extraction process. The described technique was used to obtain particle trajectories in the flow field of a vortex ring impinging on a wall.

1 Introduction

In many fluid mechanical investigations, a representation of the flow field in terms of properties attached to individual particles of the fluid, a mode of description known as the Lagrangian representation, is closer to the physics of the situation than the more common Eulerian representation, where quantities such as velocity and temperature are given as fields dependent on space and time. The concept of chaotic advection, which has recently received serious attention, is a good example in which Lagrangian information such as velocity and path lines of individual particles can be used to study technologically important issues of mixing and stirring. In this regard, the need for a technique capable of three-dimensional mapping of particle trajectories in fluid mechanical systems is self-evident.

Stereoscopic imaging is a conventional three-dimensional mapping approach in which a particle-seeded flow is recorded simultaneously from different angles. Particles are triangulated individually, and are tracked from frame to frame to recover velocity information and path lines (Elkins et al. 1977, Racca and Dewey 1988). To uniquely match many particles viewed from different angles, redundant information from a third angular position may be required (Papantoniou and Maas 1990). Calibration of such a triple-imager setup is a difficult and lengthy process, and the systems are not very portable.

Holography is relatively new to the field of three-dimensional flow mapping and it has yet to be brought to a stage where laboratory use is straightforward (Weinstein and Beeler 1986, Stanislas et al. 1986, Moraitis and Riethmuller 1990, Bernal and Scherer 1990). Data retrieval from the holographic plates is decidedly the most difficult part of the technique.

The complex nature of the aforementioned techniques has made their implementation in common laboratory situations infrequent. In this paper, we present a novel method that offers instantaneous whole-field information with a simple apparatus and straightforward data retrieval from the recorded images.

2 Range measurement using defocusing

The main feature of the three-dimensional imaging system described here is the quantitative use of the amount of blurring of illuminated particles displaced from the reference plane, a plane parallel to the imager in which the imaged objects (particles) would appear in focus. This concept was used by Blais and Rioux (1986a, 1986b) to reconstruct a three-dimensional solid surface through the projection of a grid pattern onto the surface. In this communication we extend the defocusing technique to the field of particle tracing, in which an entire volume is reconstructed by imaging particles in the flow field. Considering the fact that the particle distribution is random and dynamic, the present technique employs the defocusing method in an entirely different application.

A simplified representation of a typical imaging system is shown in Fig. 1, with examples of the projected images on the right. This figure contains three different planes of interest: the object (particle) is located in the object plane, whereas its image is projected onto the image plane. The reference plane marks the location at which objects will appear focused on the image plane. For example, a point A on the reference plane is projected as a focused point A' on the image plane (Fig. 1 a). A focused image of point source B would be observed at point B″.


Fig. 1. a Large aperture (small f-stop) leads to strong image blurring due to small depth of field; b increased depth of field with small aperture (large f-stop); c off-axis shift of small aperture leads to proportional shift on image plane for defocused objects; d two apertures generate two images of defocused object. (Schematic shows the reference plane, the camera lens (simplified), and the image plane (CCD sensor).)

If the aperture within the lens is reduced (Fig. 1 b), the solid angle of light rays emanating from a point source B is reduced. As a result, the cone of light intersected by the imager subtends a smaller solid angle and the image B' will be blurred to a lesser extent. This effect is better known as the increase in depth of field due to a smaller aperture size (i.e., larger f-stop).

If the smaller aperture is shifted away from the centerline, as in Fig. 1 c, a proportional shift is observed for the images of point sources B which are not on the reference plane. This is a direct result of the modified viewing angle of these point sources. The image shift tends to zero with decreasing distance of the point sources to the reference plane.

The image shift caused by an off-axis aperture can be utilized for measurement by adding a second aperture on the opposite side of the centerline, which results in the projection of two blurred images B' from one point source B (Fig. 1 d). The separation b of the images B' can be directly related to the distance of the object to the lens/imager system, whereas the blurredness of the individual images is proportional to the distance of the source from the reference plane and to the size of the apertures.

Essential to the recovery of three-component spatial data are the measurement of the separation b, which yields the normal (depth) component Z, and the location of the self-similar image set, which yields the in-plane components X and Y.


Fig. 2. Geometric analysis of the lens/aperture system. (Schematic shows the object at (X, Y, Z), the reference plane at z = L, the lens plane at z = 0, the image plane, and the focusing plane.)

The corresponding geometrical analysis is shown in Fig. 2, in which the camera lens of focal length f is located at z = 0. Within the lens, two pinholes are each offset a distance d/2 from the centerline, giving a total separation d. Using the well-known lens law and similar-triangle relationships, the spatial coordinates of a point source (X, Y, Z) can be calculated using only a few easily determined constants:

$$Z = \frac{1}{L^{-1} + K b}, \qquad (1)$$

where

$$K = \frac{L - f}{f\,d\,L}. \qquad (2)$$

The remaining two coordinates X, Y are found from the geometrical center (x₀, y₀) of the image pair B' using:

$$X = \frac{-x_0\, Z (L - f)}{f L}, \qquad (3)$$

$$Y = \frac{-y_0\, Z (L - f)}{f L}. \qquad (4)$$
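For readers implementing the method, the following Python sketch maps a measured image-pair centroid (x₀, y₀) and separation b to particle coordinates (X, Y, Z) according to Eqs. (1)-(4). The geometry constants and the sample measurement are hypothetical illustration values, not those of the experiment described in Sect. 4.

```python
# Minimal sketch of Eqs. (1)-(4): image measurements -> particle position.
# All numerical values are illustrative assumptions, not data from the paper.

def particle_position(x0, y0, b, f, d, L):
    """Return (X, Y, Z) from the image-pair centroid (x0, y0) and separation b.

    f : lens focal length
    d : pinhole (aperture) separation
    L : distance from the lens to the reference plane
    All lengths are in consistent units (e.g. mm); image-plane quantities are
    assumed to have been converted from pixels to the same units.
    """
    K = (L - f) / (f * d * L)          # Eq. (2)
    Z = 1.0 / (1.0 / L + K * b)        # Eq. (1)
    X = -x0 * Z * (L - f) / (f * L)    # Eq. (3)
    Y = -y0 * Z * (L - f) / (f * L)    # Eq. (4)
    return X, Y, Z

if __name__ == "__main__":
    # Hypothetical geometry: 25 mm lens, 6 mm pinhole separation,
    # reference plane 250 mm from the lens.
    f, d, L = 25.0, 6.0, 250.0
    # Hypothetical measurement on the image plane (already in mm).
    x0, y0, b = 0.8, -0.3, 0.12
    print(particle_position(x0, y0, b, f, d, L))
```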

Solving (1) for the image separation b reveals several interesting performance characteristics of the lens/aperture system:

$$b = \frac{1}{K}\left(\frac{1}{Z} - \frac{1}{L}\right). \qquad (5)$$

Three operating regimes can be distinguished. If the object (particle) is in front of the reference plane (i.e., Z < L), the separation b is greater than zero and a noninverted image is formed on the image plane. If, on the other hand, Z exceeds L, that is, the object is not located in the region between the reference plane and the lens, b is negative and an inverted image is formed; in the case of three pinholes in a triangular configuration, the triangular particle image pattern flips as the particle passes through the reference plane (Z = L). Between these two regimes lies a region where the separation b becomes too small to be resolved because the images blur into each other (i.e., b → 0 as |Z − L| → 0).

The best performance of the system is achieved for a maximum change in the separation b with respect to depth changes Z or, mathematically speaking, when the derivative

$$\frac{\partial b}{\partial Z} = -\frac{1}{K Z^{2}} \qquad (6)$$

achieves a maximum value. Of course, there is no maximum in this "gain" function, but the region between the reference plane and the lens exhibits a larger gain |∂b/∂Z| than the region past the reference plane. The former should therefore be chosen for optimum performance. Also, the gain factor K should be kept small to improve the overall gain ∂b/∂Z of the system. This means that the pinhole (aperture) separation d has to be kept as large as possible, whereas the lens focal length f and reference plane distance L are specific to the application. It is important to note that the pinhole diameter has no bearing on either the gain or the image separation b and is only responsible for the blurredness of the individual images.

An important issue for all types of instrumentation is resolution; in this case it is the uncertainty of locating a point in space given the information on the image plane. In this respect a distinction has to be made between the resolution obtained normal to the image plane and parallel to the image plane, hereafter referred to as δ_z and δ_xy, respectively. On the image plane, resolution is ultimately defined by the array of picture elements (pixels) that make up the sensor. Particle images on the array usually span several pixels, which permits their centers to be computed to sub-pixel accuracy. If the centroid (x₀, y₀) of a particle image set can be measured with an uncertainty of δ in the image plane then, by using (3) or (4), the uncertainty of the particle location in space is:

$$\delta_{xy} = \delta \left| \frac{-Z (L - f)}{f L} \right| = \delta \left| -Z\,d\,K \right|. \qquad (7)$$

The coefficient ZdK is the magnification factor between the object (particle) and its image. The uncertainty normal to the object plane, δ_z, can be derived from (6):

$$\delta_z = \delta_b \left| \frac{\partial b}{\partial Z} \right|^{-1} = \delta_b\, K Z^{2}, \qquad (8)$$

where δ_b is the uncertainty in measuring the particle image separation b on the image plane (i.e., pixel array).


Typically, the uncertainty δ_b is of the order of δ, the uncertainty of locating an individual particle image. Based on this assumption, the depth and in-plane resolutions may be related by combining (7) and (8):

$$\frac{\delta_z}{\delta_{xy}} \approx \frac{Z}{d}. \qquad (9)$$

This is an interesting but logical result, because only the distance from the lens Z and the pinhole separation d enter; together they define the viewing angle subtended at the object by the pinholes. A larger viewing angle, that is, a smaller Z/d ratio, will improve the depth resolution with respect to the in-plane resolution.
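A quick numerical check of Eqs. (7)-(9), again with assumed values of δ, Z and the geometry constants, confirms that the ratio of depth to in-plane uncertainty reduces to Z/d when δ_b ≈ δ:

```python
# Verify Eq. (9): delta_z / delta_xy ~= Z / d, assuming delta_b ~= delta.
# All inputs are illustrative assumptions.

f, d, L = 25.0, 6.0, 250.0
K = (L - f) / (f * d * L)                 # Eq. (2)
delta = 0.01                              # assumed centroid uncertainty on the image plane (mm)
delta_b = delta                           # assumption stated in the text

for Z in (150.0, 175.0, 200.0):
    delta_xy = delta * abs(-Z * d * K)    # Eq. (7)
    delta_z = delta_b * K * Z * Z         # Eq. (8)
    print(f"Z = {Z:5.1f} mm   delta_z/delta_xy = {delta_z / delta_xy:5.1f}   Z/d = {Z / d:5.1f}")
```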

3 Practical implementation

A quick demonstration of the above concept can be done by mounting the multiple aperture mask directly in front of the lens of a camera. This mask placement works well for long focal length lenses or large working distances L. As the lens viewing angle increases, as in the case of small focal lengths and small working distances L, the mask will block the view in some areas of the field and should be mounted within the lens. The best location for the multiple aperture mask within the lens is close to the existing adjustable aperture or iris.

In our initial defocusing experiments a dual-pinhole mask generated too much ambiguous data. This was due to the difficulty of uniquely matching particle image pairs in densely seeded flows. The addition of a third pinhole to form a triangular pattern resulted in the generation of easily identifiable, self-similar triangular particle image patterns on the image plane. Except for some minor modifications, the previously derived equations remain valid.
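One possible way to identify such triplets is sketched below: given the centroids of all detected particle images in a frame, search for three points whose pairwise distances are nearly equal and smaller than a size bound. This brute-force sketch is only an illustration, not the matching scheme used by the authors; the tolerance and size parameters are assumptions, and a practical implementation would also have to handle triplets that shrink toward a single spot near the reference plane.

```python
# Illustrative brute-force search for near-equilateral triplets among detected
# particle image centroids.  Not the authors' algorithm; tolerances are assumed.
from itertools import combinations
from math import dist

def find_triplets(centroids, max_side, tol=0.15):
    """Return index triples whose three pairwise distances are nearly equal
    (within the relative tolerance 'tol') and shorter than 'max_side'."""
    triplets = []
    for i, j, k in combinations(range(len(centroids)), 3):
        a = dist(centroids[i], centroids[j])
        b = dist(centroids[j], centroids[k])
        c = dist(centroids[k], centroids[i])
        sides = sorted((a, b, c))
        if sides[2] > max_side:
            continue
        if (sides[2] - sides[0]) / sides[2] < tol:   # nearly equilateral
            triplets.append((i, j, k))
    return triplets

# Hypothetical centroids (pixels): one valid triplet plus an unrelated image.
points = [(100.0, 100.0), (110.0, 100.0), (105.0, 108.7), (300.0, 50.0)]
print(find_triplets(points, max_side=20.0))   # -> [(0, 1, 2)]
```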

Conventional CCD cameras can be used for imaging the three-dimensional flow field. In the experiment described in the following section, we used a Sony AVC-D1 monochrome CCD camera with a 25 mm lens containing the three-pinhole mask. The mask's holes, 1.0 mm in diameter, form an equilateral triangle with 6 mm sides. Lighting is provided by a strobe light synchronized to the video camera's framing rate (30 Hz). Bright illumination sources are required because the small apertures of the mask within the lens transmit little light to the sensor. Here, 200 μm white polymer spheres of near-neutral buoyancy were used.

4 Application in an experiment

Particles in the flow field of a vortex ring impinging on a wall were tracked (Fig. 3) with the camera/aperture system. Injected into the vortex generator, the particles moved with the vortex after its formation.


Fig. 3. Experimental setup for imaging a seeded vortex ring impinging on a wall (strobe light synchronized to the camera's 30 Hz framing signal, mask implanted in the lens, vortex generator, clear solid tank wall, video recorder or image digitizer)

Continuous sequences of up to 30 s were recorded with a video recorder or digital video disk for later analysis. Three frames from one of these sequences are shown in Fig. 4. The individual images are separated by 20 frames (0.66 s), and corresponding triplets are labeled to demonstrate their changing size and position. The imaged section is the upper region of the impinging ring, as shown in Fig. 4 d.

Surprisingly, animated sequences of the kind shown in Fig. 4 already convey qualitative three-dimensional information on the global flow field to the viewer. It is thought that the human brain subconsciously relates the changing size of the moving self-similar images to changes in depth. A computational scheme emulating this subconscious data processing can be devised through the use of pattern-recognition methods such as correlation and transforms.

Quantitative measurements were made from one digitized sequence of 200 frames (6.66 s), recorded as the vortex ring reached the wall and expanded radially outward. Particle image triplets were first located in each frame by identifying self-similar triangles. For each particle image in a triplet, a centroid was computed using a center-of-mass algorithm. From these centroids, the separation b was calculated by averaging the distances between the triangle corner points. The three-dimensionally located particles were then tracked from frame to frame by searching for the nearest spatial particle location in a predictor/corrector scheme. Three such paths are shown in Fig. 5. They clearly show that particles moving on orbits closer to the core have a greater orbital speed than particles farther from the core. Also, the three particles separate from each other due to the spreading action of the impinging vortex.
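The frame-to-frame association step can be sketched as a simple nearest-neighbour search in three-dimensional space; the predictor/corrector refinement mentioned above would additionally extrapolate each particle's previous displacement before searching. The sketch below is a simplified stand-in for that scheme, and the search radius is an assumed parameter.

```python
# Minimal nearest-neighbour association of 3-D particle positions between two
# frames.  A simplified stand-in for the predictor/corrector scheme described
# in the text; 'max_step' is an assumed search radius.
from math import dist

def match_frames(prev, curr, max_step):
    """Return (index_prev, index_curr) pairs, each previous particle matched
    to its nearest unclaimed neighbour in the current frame."""
    pairs, claimed = [], set()
    for i, p in enumerate(prev):
        best, best_d = None, max_step
        for j, q in enumerate(curr):
            if j in claimed:
                continue
            d = dist(p, q)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            pairs.append((i, best))
            claimed.add(best)
    return pairs

# Hypothetical (X, Y, Z) positions in mm for two consecutive frames.
frame_a = [(10.0, 5.0, 180.0), (12.0, 7.0, 190.0)]
frame_b = [(12.3, 7.2, 189.5), (10.4, 5.1, 179.8)]
print(match_frames(frame_a, frame_b, max_step=2.0))   # -> [(0, 1), (1, 0)]
```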

Figure 5 represents raw, scaled data which still has to be converted to true spatial coordinates by taking into account the change of refractive indices, the viewing angles, and all essential calibration factors used in (1) through (4).


Fig. 4 a-d. Sequence of three frames taken at 0.66 s intervals (20 frames) from a video recording of a vortex ring impinging on a wall; d schematic showing the imaged volume, reference plane, and particle images. Individual particles are labeled (A through E) for clarity. Particle E is behind the focal plane, away from the vortex ring, and shows little movement

A preliminary analysis of the described system showed that the in-plane resolution δ_xy was roughly 40 times better than the depth resolution δ_z. Here, the reference plane distance L was 250 mm, whereas the particle locations Z varied anywhere from 150 to 200 mm.

The described technique is also suitable for mapping the instantaneous elevation field of a free surface. Figure 6 shows a reconstruction of a free surface on which floating, illuminated particles generated an image similar to that of Fig. 4.

5 Conclusion

Although many automated schemes for the identification and measurement of the self-similar particle images come to mind, they go beyond the scope of this communication. The primary objective was to introduce a new approach to three-dimensional particle imaging in experimental fluid dynamics research by highlighting its simple design, calibration, and use. The fundamental concept was described along with


Fig. 5. Pathlines in space of a vortex ring impinging on a wall. Three separate plots of the same region are shown for clarity

a formulation of the governing equations, which are felt to be valid to within first-order accuracy. Not included in this formulation are phenomena such as lens aberration, resolution limitations due to diffraction, the modulation transfer characteristics of the lens, and others. Issues such as high seeding density and three-dimensional interpolation from randomly located velocity vectors have yet to be addressed. In spite of these, the authors feel that this single-camera three-dimensional imaging system will have a solid footing alongside other imaging techniques such as holography and stereoscopy.


Fig. 6. Free-surface elevation data obtained with the defocusing technique


Acknowledgments

This communication was motivated by many inquiries on the subject after a presentation at the 43rd Annual APS/DFD Meeting in Ithaca, New York, 23-24 November 1990. We would like to thank many of our fellow researchers for their encouragement leading to publication of the presented material. Special thanks also go to D. Liepmann for his participation in the free-surface mapping experiment. This research was made possible through financial support from the Office of Naval Research/Fluid Dynamics Division (Contract number N00014-89-J-1529).

References

Bernal, L. P.; Scherer, J. 1990: Holographic particle image velocimetry. 43rd Annual Meeting of the Division of Fluid Dynamics of the Am. Phys. Soc., Cornell University, Ithaca, New York, 18-20 November

Blais, F.; Rioux, M. 1986a: BIRIS: a simple 3D sensor. SPIE Proceedings on Optics, Illumination, and Image Sensing for Machine Vision 728, 235-242, 30-31 October, Cambridge, MA

Blais, F.; Rioux, M. 1986b: Compact three-dimensional camera for robotic applications. J. Opt. Soc. Am. A 3, 1518-1521

Elkins, R. E.; Jackman, G. R.; Johnson, R. R.; Lindgren, E. R.; Yoo, J. K. 1977: Evaluation of stereoscopic trace particle records of turbulent flow fields. Rev. Sci. Instrum. 48, 738-746

Moraitis, C. S.; Riethmuller, M. L. 1990: Real time optical processing of holographically recorded particle images for 3-D whole field velocimetry. Proc. of the 5th Int. Symposium on Application of Laser Techniques to Fluid Mechanics. Lisbon: Instituto Superior Tecnico, LADOAN, Portugal, 9-12 July

Papantoniou, D.; Maas, H.-G. 1990: Recent advances in 3-D particle tracking velocimetry. Proc. of the 5th Int. Symposium on Application of Laser Techniques to Fluid Mechanics. Lisbon: Instituto Superior Tecnico, LADOAN, Portugal, 9-12 July

Racca, R. G.; Dewey, J. M. 1988: A method for automatic particle tracking in a three-dimensional flow field. Exp. Fluids 6, 25-32

Stanislas, M.; Rodriguez, O.; Dadi, M.; Beluche, F. 1986: Application of high speed holography to aerodynamic and hydrodynamic three-dimensional velocimetry. AGARD-CP-413, Aerodynamic and related hydrodynamic studies using water facilities, Monterey, USA, 20-23 October

Weinstein, L. M.; Beeler, G. B. 1986: Flow measurements in a water tunnel using a holocinematographic velocimeter. AGARD-CP-413, Aerodynamic and related hydrodynamic studies using water facilities, Monterey, USA, 20-23 October

Received June 13, 1991