Optics and Lasers in Engineering 48 (2010) 191–204

Dynamic 3-D shape measurement method: A review

Xianyu Su*, Qican Zhang

Opto-Electronics Department, Sichuan University, Chengdu 610064, China

Article info

Available online 8 May 2009

Keywords: Dynamic 3-D shape measurement; Grating projection; Structured illumination; Fringe analysis; Fourier transform profilometry; Phase unwrapping; Stroboscope; Rotating blade; Vibration analysis

* Corresponding author. Tel.: +86 28 85463879; fax: +86 28 85464568.
E-mail address: [email protected] (X. Su).
doi:10.1016/j.optlaseng.2009.03.012

Abstract

Three-dimensional (3-D) shape measurement of a dynamic object or process, whose height distribution varies with time, has been a hot topic in recent years because of its wide field of application, and a number of techniques have been presented and studied in depth. Among the non-contact 3-D shape measurement methods for a dynamic object or process, an optical measurement system based on 2-D grating pattern projection and fast Fourier transform (FFT) fringe analysis has been developed and widely used, owing to its particular merits: it requires only low-cost, easy-to-use equipment, records the full-field information simultaneously, and needs only one frame of the deformed fringe pattern to reconstruct the height distribution with fast data processing. In this paper, after an overview of dynamic 3-D shape measurement techniques, the basic principles and typical applications of this grating-projection and fringe-analysis technique, which has attracted our attention and research effort over the past ten years, are reviewed as the main objective. Finally, the high-definition real-time depth-mapping TV camera, a system combining high-resolution 2-D color imaging with depth sensing, is briefly restated as a promising development trend for 3-D modeling, robotics and graphics animation.

© 2009 Elsevier Ltd. All rights reserved.

1. Introduction

Any object's three-dimensional (3-D) curved surface can be represented by an array of points with known Cartesian coordinates (x, y, z), and a mathematical model can then be formulated to describe the surface shape based on these points. Therefore, the ultimate goal of 3-D shape measurement techniques, both mechanical and optical, is the determination of these Cartesian coordinates. For a dynamic object or scene, the Cartesian coordinates of these points vary with time. From these time-varying Cartesian coordinates, other quantities, such as displacement and curvature, can be calculated.

Optical non-contact 3-D shape measurement (namely 3-D sensing) [1,2] is concerned with extracting geometric information from images of the measured object. With its high speed and high accuracy, it has been widely used for 3-D sensing, machine vision, robot simulation, industrial monitoring, dressmaking, biomedicine, etc. It can be divided into two types of techniques: active 3-D shape measurement and passive 3-D shape measurement.

In passive 3-D shape measurement, for a dynamic object such as human facial animation, Mova [3] provides 3-D performance capture services and technology for capturing the motion. For a dynamic object such as a dynamically deforming sample, the fine-grid method [4] is used: a regular pattern, such as a well-defined array of lines, is adhered to the measured sample's surface, and an image of this grid is obtained before and after deformation. Automatic analysis of the dynamic deformation of this fine grid has allowed full-field measurement of in-plane displacement [5].

Usually, according to their basic physical principles, active 3-D shape measurement techniques can be classified into two categories: one is based on time delay, which employs the speed of light or laser coherence to measure 3-D shape; the other is triangulation-based, which mainly uses structured pattern projection to demodulate the measured object's information.

The subdivisions of the first category are time-of-flight (TOF) and interferometry (optically coherent detection). Among interferometric methods, electronic speckle pattern interferometry (ESPI) has been an intense research subject in recent years and shows good performance for full-field vibration or rotation measurements [6–10]. In this area, for the dynamic characterization of micro-electromechanical systems (MEMS), Doppler interferometry, optical microscopic interferometry, digital holography and stroboscopic interferometry have been developed to achieve full-field out-of-plane measurements [11–15].

The subdivisions of the triangulation-based techniques, according to their projected patterns, are single-spot projection, 1-D line-shaped light projection and 2-D fringe projection. In the single-spot projection method, accelerometers and laser Doppler vibrometers (LDV) [16] are pointwise measurement techniques and are used in conjunction with spectrum analyzers and modal analysis software to characterize the dynamic object, especially its vibration behavior [17]; they require time-consuming 1-D or 2-D scanning. A point-array encoding method has been proposed to calculate object height directly through an affine transformation [18]. Yoshihiro Watanabe [19] uses a multi-spot laser projection method, in which numerous pre-calibrated spots are projected onto the measured dynamic object, together with a high-speed vision system with a co-processor for numerous-point analysis, to reconstruct the 3-D shape of the dynamic object. In the 1-D line-shaped light projection method, a variation of machine vision techniques uses line-shaped laser light combined with machine vision sensors to give the repeatability advantages of coordinate measuring machines (CMMs) and the speed advantages of machine vision [20–23].

Unlike the 1-D line-shaped light projection method, the 2-D fringe projection method has the advantages of speed and of providing whole-field information. Several 3-D shape measurement methods based on 2-D fringe projection, including the moiré technique (MT) [24–27], phase measuring profilometry (PMP) [28–32], Fourier transform profilometry (FTP) [33–37], modulation measurement profilometry (MMP) [38–40], spatial phase detection (SPD) [41–43], color-coded fringe projection [44–49] and gray-coded binary fringe sequences [50–52], have been exhaustively studied. Some of them need a series of fringes to be projected or a phase-shifting process to be executed, which is unfit for measuring the 3-D shape of a dynamic object. A technique that is fit for dynamic measurement must recover the shape information from only one image, so it is also called a 'one-shot' structured light technique. A simple and stable grid pattern, formed by a number of straight lines distinguishable only as vertical or horizontal lines with different colors, has been used to densely measure the shapes of both static and dynamic scenes and objects [48,49]. A set of gray-scale random stripe patterns has been projected onto a dynamic object, and the time-varying depth maps have then been recovered by a space-time stereo method [51].

Actually, the more commonly used fringe is a 2-D Ronchi or sinusoidal grating [53–61]. This grating is projected onto the measured dynamic object's surface so that the fringes are modulated by its height distribution, and then a series of deformed fringe patterns is recorded from another view and processed by full-field fringe analysis to demodulate the time-varying 3-D shape information.

Among the studied fringe analysis methods, FTP [33–36], first introduced by Takeda, is particularly fit for high-speed measurement because of its merits: only one (or two) fringe pattern(s) is needed, the analysis is full-field, the precision is high, etc. In the past years, the FTP method has been extensively studied and improved. A two-dimensional Fourier transform and 2-D Hanning filtering are applied to provide a better separation of the height information from noise when speckle-like structures and discontinuities exist in the fringe pattern [35]. Compared with MT, FTP can accomplish fully automatic distinction between a depression and a protrusion of the measured object shape. It requires no fringe-order assignment or fringe-center determination, and it requires no interpolation between fringes because it gives the height distribution at each pixel over the entire field. Compared with PMP and MMP, FTP requires only one or two images of the deformed fringe pattern, which makes real-time data processing and dynamic data processing possible [36,37].

Recently, a high-definition real-time depth-mapping television (TV) camera (the HDTV Axi-vision camera) [62–66] has been proposed. It achieves simultaneous high-resolution 2-D color imaging and depth sensing in real time. At first sight, the main purpose of this 3-D TV camera system might not be the measurement of 3-D shape, but it can detect 3-D information and has a wide range of applications in television program production, 3-D modeling, robotic vision and graphics animation. It could be a good development trend for dynamic 3-D shape measurement, since it naturally pushes the measurement speed toward real time.

After an overview of the classification and achievements of dynamic 3-D shape measurement techniques, the technique based on grating projection and fringe analysis, which has attracted our attention and research effort over the past ten years, is explained and illustrated in this review. It is organized as follows. Section 2 gives a theoretical analysis of the basic principles of dynamic 3-D shape measurement based on fringe projection and FTP. Section 3 describes the process of 3-D phase calculation and unwrapping. Section 4 illustrates the applications of this method in different fields. Finally, in Section 5, 3-D TV, as a prominent development trend, is also reviewed.

2. Basic principles of dynamic 3-D shape measurement

FTP for dynamic 3-D shape measurement is usually implemented as follows. A Ronchi grating or sinusoidal grating is projected onto the object's surface. Then a sequence of dynamically deformed fringe images is grabbed by a CCD camera from another view and saved in a computer rapidly. Next, the data are processed in three steps. First, by using the Fourier transform, we obtain their spectra, which are isolated in the Fourier plane when the sampling theorem is satisfied. Second, by adopting a suitable band-pass filter (e.g., a 2-D Hanning window) in the spatial frequency domain, all the frequency components are eliminated except the fundamental component, and by calculating the inverse Fourier transform of the fundamental component, a sequence of phase maps can be obtained. Third, by applying a phase unwrapping algorithm in the 3-D phase space, the height distributions of the measured object at different times can be reconstructed through a suitable phase-to-height mapping.
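The three-step processing of a single frame can be sketched compactly with an FFT library. The following Python fragment is only an illustrative sketch of this pipeline, not the implementation used by the authors; the peak search, the size of the Hanning window and all parameter names are assumptions made for this example.

```python
# A minimal sketch of single-frame FTP phase extraction (illustrative only).
import numpy as np

def ftp_complex_signal(fringe, halfwidth=20):
    """Return the complex fundamental-component signal of one fringe image.

    fringe    : 2-D array holding one deformed fringe pattern g(x, y)
    halfwidth : assumed half-size (in frequency samples) of the band-pass window
    """
    rows, cols = fringe.shape
    spectrum = np.fft.fftshift(np.fft.fft2(fringe))

    # Locate the fundamental peak in the right half-plane (the zero order and
    # the conjugate peak are suppressed before the search).
    search = spectrum.copy()
    search[:, : cols // 2 + 2] = 0
    peak_r, peak_c = np.unravel_index(np.argmax(np.abs(search)), search.shape)

    # Band-pass filter: a 2-D Hanning window centred on the fundamental component.
    window = np.zeros_like(spectrum)
    r0, r1 = max(peak_r - halfwidth, 0), min(peak_r + halfwidth + 1, rows)
    c0, c1 = max(peak_c - halfwidth, 0), min(peak_c + halfwidth + 1, cols)
    window[r0:r1, c0:c1] = np.outer(np.hanning(r1 - r0), np.hanning(c1 - c0))

    # Inverse transform of the filtered spectrum gives the complex signal
    # A1 * r(x, y) * exp{i[2*pi*f0*x + phi(x, y)]} used in the equations below.
    return np.fft.ifft2(np.fft.ifftshift(spectrum * window))
```

The wrapped phase of frame t relative to the reference frame is then the argument of the conjugate product of the two complex signals, as formalized in Section 3.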

The optical geometry of the measurement system for a dynamic object is similar to that of traditional FTP, as shown in Fig. 1, in which the optical axis E_p'–E_p of the projector lens crosses the optical axis E_c'–E_c of the camera lens at point O on a reference plane; this plane is normal to the optical axis E_c'–E_c and serves as a fictitious reference from which the object height Z(x, y) is measured. d is the distance between the projector system and the camera system, and l_0 is the distance between the camera system and the reference plane.

Fig. 1. Optical geometry of a dynamic 3-D shape measurement system based on FTP.


By projecting a grating onto the reference plane, the grating image (with period p_0) on the reference plane, observed through the CCD camera from the other view, can be represented by

g_0(x, y) = \sum_{n=-\infty}^{+\infty} A_n r_0(x, y) \exp\{ i [2n\pi f_0 x + n\phi_0(x, y)] \}    (1)

where r_0(x, y) is the non-uniform distribution of reflectivity on the reference plane, A_n are the weighting factors of the Fourier series, f_0 (f_0 = 1/p_0) is the fundamental frequency of the observed grating image, and φ_0(x, y) is the original phase on the reference plane (i.e., Z(x, y) = 0). The coordinate axes are chosen as shown in Fig. 1.

When the measured object is stationary, the image intensity obtained by the CCD camera is independent of time and is usually expressed as g(x, y). But when a dynamic 3-D object, whose height distribution varies with time, is placed in the optical field, the intensity of these fringe patterns is obviously a function of time and can be written as g(x, y, z(t)), and the phase distribution that encodes the height variation of the measured dynamic object is also a function of time and can be written as φ(x, y, t). Strictly speaking, in dynamic scenes the x and y coordinates also change with time, but these changes are much smaller than those of the z coordinate and are therefore usually ignored. A sequence of the deformed fringe patterns can be grabbed by the CCD camera and stored in a computer rapidly. The intensity distributions of these fringe patterns at different times can be expressed as

g(x, y, z(t)) = \sum_{n=-\infty}^{+\infty} A_n r(x, y, t) \exp\{ i [2n\pi f_0 x + n\phi(x, y, t)] \}, \quad t = 1, 2, \ldots, m    (2)

where r(x, y, t) and φ(x, y, t) respectively represent the non-uniform distribution of reflectivity on the object surface and the phase modulation caused by the object height variation at each time, and m is the number of fringe images grabbed by the CCD camera.

The Fourier transform, filtering of only the first-order term (n = 1) of the Fourier spectra, and the inverse Fourier transform are carried out on each fringe pattern grabbed by the CCD camera at each time. The complex signals at different times can then be calculated:

g(x, y, z(t)) = A_1 r(x, y, t) \exp\{ i [2\pi f_0 x + \phi(x, y, t)] \}    (3)

The same operations are applied to the fringe pattern on the reference plane to obtain the complex signal of the reference plane:

g_0(x, y) = A_1 r_0(x, y) \exp\{ i [2\pi f_0 x + \phi_0(x, y)] \}    (4)

Noting the geometric relationship between the two similar triangles E_pHE_c and CHD in Fig. 1, we can write

CD = \frac{-d \, Z(x, y, t)}{l_0 - Z(x, y, t)}    (5)

The phase variation resulting from the object height distribution is

\Delta\phi(x, y, t) = \phi(x, y, t) - \phi_0(x, y) = 2\pi f_0 (BD - BC) = 2\pi f_0 \, CD    (6)

Substituting Eq. (5) into Eq. (6) and solving for Z(x, y, t), the formula for the height distribution is obtained:

Z(x, y, t) = \frac{l_0 \, \Delta\Phi(x, y, t)}{\Delta\Phi(x, y, t) - 2\pi f_0 d}    (7)

where ΔΦ(x, y, t) is the unwrapped phase distribution of Δφ(x, y, t). The measurable slope of the height variation is [34]

\left| \frac{\partial Z(x, y)}{\partial x} \right|_{\max} < \frac{l_0}{3d}    (8)

This means that FTP can only be used for surfaces whose slopes do not exceed this limit. When the slope of the height variation exceeds this limit, the fundamental component overlaps the other components and the reconstruction fails.
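Numerically, Eq. (7) is a direct per-pixel mapping from the unwrapped phase difference to height. The fragment below is only an illustrative sketch of that mapping; the function and variable names are invented for this example.

```python
# Height from the unwrapped phase difference, Eq. (7) (illustrative sketch).
import numpy as np

def height_from_phase(dPhi, l0, d, f0):
    """dPhi : unwrapped phase difference DeltaPhi(x, y, t) as a 2-D array
       l0, d : geometry parameters (same length unit as the returned height)
       f0    : fundamental frequency of the projected grating (1/p0)."""
    # The denominator is normally negative because 2*pi*f0*d >> |dPhi|,
    # so the sign of Z is opposite to that of dPhi in this geometry.
    return l0 * dPhi / (dPhi - 2 * np.pi * f0 * d)
```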

In practice, to determine the height of each point of the object, the dual-direction nonlinear phase-to-height mapping technique is usually adopted; a detailed description is given in Ref. [67]. Generally, the relation between the phase and the height Z(x, y, t) can be written as

\frac{1}{Z(x, y, t)} = a(x, y) + \frac{b(x, y)}{\Delta\Phi_r(x, y, t)} + \frac{c(x, y)}{\Delta\Phi_r^2(x, y, t)}    (9)

where, for a sampling instant t, ΔΦ_r(x, y, t) = Φ(x, y, t) − Φ_r(x, y) is the phase difference between the two unwrapped phase distributions, Φ(x, y, t) for the measured object and Φ_r(x, y) for the reference plane. Z(x, y, t) is the relative height from the reference plane, and a(x, y), b(x, y) and c(x, y) are the mapping parameters, which can be calculated from the continuous phase distributions of four or more standard planes with known heights. The height distribution of the measured object at each sampling instant is then obtained from Eq. (9), as long as its 3-D phase distribution has been unwrapped.
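Because Eq. (9) is linear in a(x, y), b(x, y) and c(x, y), the calibration can be posed as a per-pixel least-squares fit over the reference planes. The sketch below shows one way this could be organised; it is not the procedure of Ref. [67], and the array shapes and names are assumptions.

```python
# Per-pixel least-squares calibration of the mapping of Eq. (9) (illustrative).
import numpy as np

def fit_phase_to_height(dPhi_stack, Z_known):
    """dPhi_stack : (N, H, W) unwrapped phase differences, one per reference plane
       Z_known    : (N,) known, non-zero plane heights
       returns the a, b, c maps, each of shape (H, W)."""
    N, H, W = dPhi_stack.shape
    # Design matrix per pixel: [1, 1/dPhi, 1/dPhi^2]; target: 1/Z.
    A = np.stack([np.ones_like(dPhi_stack),
                  1.0 / dPhi_stack,
                  1.0 / dPhi_stack ** 2], axis=-1)        # (N, H, W, 3)
    A = A.reshape(N, -1, 3).transpose(1, 0, 2)            # (H*W, N, 3)
    y = np.tile(1.0 / Z_known, (H * W, 1))                # (H*W, N)
    coeffs = np.array([np.linalg.lstsq(Ai, yi, rcond=None)[0]
                       for Ai, yi in zip(A, y)])          # (H*W, 3)
    a, b, c = (coeffs[:, k].reshape(H, W) for k in range(3))
    return a, b, c

# Height recovery at a later instant then follows Eq. (9):
#   Z = 1.0 / (a + b / dPhi_t + c / dPhi_t ** 2)
```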

3. 3-D phase calculation and unwrapping

3.1. 3-D wrapped phase calculation

For convenience of discussion, the fringe pattern on the reference plane is chosen as a special deformed fringe obtained at t = 0, so we can rewrite Eq. (4) as

g_0(x, y) = g(x, y, z(0)) = A_1 r(x, y, 0) \exp\{ i [2\pi f_0 x + \phi(x, y, 0)] \}    (10)

There are two methods to obtain the phase variation Δφ(x, y, t) = φ(x, y, t) − φ_0(x, y) resulting from the height variation at different times. Details are discussed below.

3.1.1. Direct phase algorithm

By directly multiplying g(x, y, z(t)) with the conjugate g*(x, y, z(0)) in each 2-D frame at each sampling time, we obtain

g(x, y, z(t)) \, g^{*}(x, y, z(0)) = |A_1|^2 r(x, y, 0) \, r(x, y, t) \exp[ i \, \Delta\phi(x, y, t) ]    (11)

The phase distribution Δφ(x, y, t) can be calculated by

\Delta\phi(x, y, t) = \phi(x, y, t) - \phi(x, y, 0) = \arctan\frac{\mathrm{Im}[ g(x, y, z(t)) \, g^{*}(x, y, z(0)) ]}{\mathrm{Re}[ g(x, y, z(t)) \, g^{*}(x, y, z(0)) ]}    (12)

where Im and Re represent the imaginary and real parts of g(x, y, z(t)) g*(x, y, z(0)), respectively. By this method, we obtain the sequence of phase distributions contained in the deformed fringes, which includes the fluctuation information of the dynamic object.
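In code, Eqs. (11) and (12) reduce to taking the argument of a conjugate product of the two filtered complex signals. A short sketch (names are illustrative):

```python
# Direct phase algorithm, Eqs. (11)-(12) (illustrative sketch).
import numpy as np

def direct_wrapped_phase(c_t, c_0):
    """c_t, c_0 : filtered complex signals of Eqs. (3) and (10) for frame t and t = 0."""
    # np.angle realises the arctan(Im/Re) of Eq. (12) with the correct quadrant.
    return np.angle(c_t * np.conj(c_0))
```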

3.1.2. Phase difference algorithm

By multiplying each complex signal g(x, y, z(t)) with the conjugate of its neighbor in time, g*(x, y, z(t − 1)), we obtain

g(x, y, z(t)) \, g^{*}(x, y, z(t-1)) = |A_1|^2 r(x, y, t) \, r(x, y, t-1) \exp[ i \, \Delta\phi_t(x, y) ]    (13)

where

\Delta\phi_t(x, y) = \phi(x, y, t) - \phi(x, y, t-1) = \arctan\frac{\mathrm{Im}[ g(x, y, z(t)) \, g^{*}(x, y, z(t-1)) ]}{\mathrm{Re}[ g(x, y, z(t)) \, g^{*}(x, y, z(t-1)) ]}    (14)

Summing the Δφ_t(x, y) of all frames up to frame t, we get the phase difference between time t and time 0:

\Delta\phi(x, y, t) = \sum_{n=1}^{t} \Delta\phi_n(x, y) = \phi(x, y, t) - \phi(x, y, 0)    (15)

Evidently, the phase difference φ(x, y, t) − φ(x, y, t − 1) is always smaller than φ(x, y, t) − φ(x, y, 0), so this approach leads to an easier phase unwrapping process in 3-D space. More details can be found in Section 3.2.4.
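A direct translation of Eqs. (13)–(15) is a cumulative sum of the wrapped frame-to-frame phase increments. The sketch below assumes that the filtered complex signals of Eq. (3) are already available in a list; the names are illustrative.

```python
# Phase difference algorithm, Eqs. (13)-(15) (illustrative sketch).
import numpy as np

def accumulated_phase(signals):
    """signals : sequence of complex 2-D arrays, one per sampling instant
       (frame 0 is the reference). Returns DeltaPhi(x, y, t) relative to frame 0."""
    increments = [np.angle(signals[t] * np.conj(signals[t - 1]))   # Eq. (14)
                  for t in range(1, len(signals))]
    return np.cumsum(np.stack(increments, axis=0), axis=0)          # Eq. (15)
```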

3.1.3. 3-D Fourier fringe analysis [68]

The above-mentioned processes deal with the 3-D data (a time sequence of fringes) on an individual, 2-D frame-by-frame basis. Alternatively, the fringe analysis can be performed on a single 3-D volume rather than on a set of individual 2-D frames processed in isolation: a 3-D FFT and 3-D filtering in the frequency domain are used to obtain the 3-D wrapped phase volume, which is equivalent in performance to 2-D fringe analysis.


3.2. 3-D phase unwrapping

Since the phase calculated by any inverse trigonometric function (i.e., the arctangent) yields only principal values ranging from −π to π, the phase values have discontinuities with 2π phase jumps. For phase data without noise, these discontinuities can easily be corrected by adding or subtracting 2π wherever a phase jump from −π to π, or vice versa, occurs. This is called the phase unwrapping procedure [69–73].

3-D phase unwrapping is conducted not only along the x- and y-directions but also, where necessary, along the t-direction, so it provides more choices of path for the unwrapping process. Some points of discontinuity, which result from noise, shadow or under-sampling and cannot be unwrapped along the x- or y-direction within their own frame, can be unwrapped successfully along the t-direction. Compared with 2-D unwrapping [69–73], 3-D phase unwrapping is therefore easier and more accurate.

Fig. 2. Sketch map of 3-D diamond phase unwrapping.

Fig. 3. Sketch map of 3-D phase unwrapping based on modulation ordering.

3.2.1. Direct 3-D phase unwrapping

If the wrapped phase is reliable everywhere and the phase difference between neighboring pixels in the 3-D phase space is less than π, the unwrapping problem is trivial: in the 3-D phase space, direct phase unwrapping can be carried out along any direction.

First, we calculate the wrapped phase maps at the different times and select any pixel at the initial time as the starting point to be unwrapped. A 1-D phase unwrapping procedure is carried out along the t-direction. Afterwards, a column passing through the unwrapped pixel in every phase map is selected and unwrapped. Finally, all the rows starting from the initial column in every phase map are unwrapped, and the natural phases in the whole 3-D space are obtained. Of course, the column and row operations can be interchanged. When the time interval between the t-th frame and the initial one exceeds a certain threshold, the wrapped phase at that time is no longer reliable everywhere, and the phase difference between neighboring pixels may exceed π; under this condition, a correct surface cannot be obtained by this unwrapping method.
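The t-then-column-then-row order described above can be sketched with a standard 1-D unwrapper, under the stated assumption that the phase is reliable everywhere. The seed location and all names below are assumptions for illustration.

```python
# Direct 3-D phase unwrapping (illustrative sketch); wrapped has shape (T, H, W).
import numpy as np

def direct_3d_unwrap(wrapped, seed_row=0, seed_col=0):
    phi = wrapped.astype(float).copy()
    # 1) unwrap the seed pixel along the t-direction
    phi[:, seed_row, seed_col] = np.unwrap(phi[:, seed_row, seed_col])
    for t in range(phi.shape[0]):
        # 2) unwrap the seed column of this frame, anchored at the seed pixel
        col = np.unwrap(phi[t, :, seed_col])
        phi[t, :, seed_col] = col - col[seed_row] + phi[t, seed_row, seed_col]
        # 3) unwrap every row, anchored at its pixel on the seed column
        rows = np.unwrap(phi[t], axis=1)
        phi[t] = rows - rows[:, [seed_col]] + phi[t, :, [seed_col]]
    return phi
```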

3.2.2. 3-D diamond phase unwrapping

The 2-D phase unwrapping based on adjacent pixels (in which the unwrapping path is diamond-like) can naturally be extended to deal with the 3-D wrapped phase. If the wrapped phase is reliable everywhere, or if all the branch cuts have been placed and all the unreliable points (residues) have been covered by a 3-D mask, the 3-D phase unwrapping process can be propagated from the starting point to the 6 adjacent pixels in 3-D space. Under this condition, the path of the phase unwrapping process spreads quickly, like an inflating regular octahedron, as shown in Fig. 2. This method is time-saving and quite feasible.
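Conceptually, the diamond-like propagation is a breadth-first flood fill over the 6-connected (t, y, x) volume. The sketch below illustrates the idea; it assumes unreliable voxels are already excluded by the mask, omits branch-cut placement, and is not the authors' implementation.

```python
# 3-D "diamond" phase unwrapping as a 6-neighbour flood fill (illustrative sketch).
from collections import deque
import numpy as np

def diamond_unwrap_3d(wrapped, mask, seed):
    """wrapped : (T, H, W) wrapped phase; mask : boolean reliability map;
       seed : (t, y, x) starting voxel, assumed reliable."""
    phi = wrapped.astype(float).copy()
    done = np.zeros(wrapped.shape, dtype=bool)
    done[seed] = True
    queue = deque([seed])
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        t, y, x = queue.popleft()
        for dt, dy, dx in offsets:
            n = (t + dt, y + dy, x + dx)
            if all(0 <= n[i] < wrapped.shape[i] for i in range(3)) \
                    and mask[n] and not done[n]:
                # Remove the 2*pi ambiguity relative to the already unwrapped voxel.
                jump = phi[n] - phi[t, y, x]
                phi[n] -= 2 * np.pi * np.round(jump / (2 * np.pi))
                done[n] = True
                queue.append(n)
    return phi
```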

3.2.3. 3-D phase unwrapping based on modulation ordering [60]

In an actual measurement, many factors, such as noise, fringe breaks, shadow, under-sampling resulting from very high fringe density, and under-modulation from very sparse fringe density, make the actual unwrapping procedure complicated and path dependent. Combined with a modulation analysis technique, the 2-D phase unwrapping method based on modulation ordering has been well discussed in Refs. [36,69].

We now extend this 2-D phase unwrapping procedure, which uses the modulation as a reliability function, to the 3-D phase space. The unwrapping path always runs from a pixel with higher modulation to a pixel with lower modulation, until the entire 3-D wrapped phase is unwrapped. The unwrapping scheme is shown in Fig. 3, where the lines with arrows display one of the phase unwrapping paths based on modulation ordering.
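One way to realise this ordering is to replace the queue of the previous sketch with a max-heap keyed by modulation, so that, on the boundary of the already-unwrapped region, the voxel with the highest reliability is always unwrapped next. How the modulation map is computed (for example, from the magnitude of the filtered fundamental component) is an assumption here, and the sketch is again illustrative rather than the authors' code.

```python
# Modulation-ordered 3-D unwrapping with a quality-guided heap (illustrative sketch).
import heapq
import numpy as np

def modulation_ordered_unwrap_3d(wrapped, modulation, seed):
    phi = wrapped.astype(float).copy()
    done = np.zeros(wrapped.shape, dtype=bool)
    done[seed] = True
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

    def neighbours(v):
        for dt, dy, dx in offsets:
            n = (v[0] + dt, v[1] + dy, v[2] + dx)
            if all(0 <= n[i] < wrapped.shape[i] for i in range(3)):
                yield n

    heap = [(-modulation[n], n) for n in neighbours(seed)]
    heapq.heapify(heap)
    while heap:
        _, v = heapq.heappop(heap)
        if done[v]:
            continue
        ref = next(n for n in neighbours(v) if done[n])  # an unwrapped neighbour
        phi[v] -= 2 * np.pi * np.round((phi[v] - phi[ref]) / (2 * np.pi))
        done[v] = True
        for n in neighbours(v):
            if not done[n]:
                heapq.heappush(heap, (-modulation[n], n))
    return phi
```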

3.2.4. 3-D phase unwrapping based on a phase difference algorithm

When the frame rate of the frame grabber is high enough, the time interval between two grabbed fringe patterns is small, so the phase difference between two neighboring pixels with the same image coordinates is less than π along the t-direction. We select only one phase map (assuming its index to be S), which can be unwrapped easily and accurately by a 2-D phase unwrapping procedure, and whose natural phase is Φ_uwS(x, y, S). The unwrapped phase at any time t = K can then be obtained by summing the phase differences from t = S to t = K along the t-direction. We can describe it as follows:

\Phi_{uwK}(x, y, K) = \Phi_{uwS}(x, y, S) + \sum_{i=S+1}^{K} \Delta\phi_i(x, y)    (16)

where Φ_uwK(x, y, K) stands for the unwrapped phase map at t = K, and Δφ_i(x, y) represents the phase difference between frames t = i and t = i − 1. The schematic diagram of 3-D phase unwrapping based on the phase difference algorithm is shown in Fig. 4.

Fig. 4. Sketch map of 3-D phase unwrapping based on the phase difference algorithm.

The advantage of this method is that only one frame of wrapped phase, with reliable phase everywhere, needs to be unwrapped, and the unwrapping procedure is very simple under this condition. The precise phase values in the whole 3-D phase field can then be obtained by summing along the t-direction.
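A sketch of Eq. (16) follows: one reliable anchor frame S is unwrapped spatially with an off-the-shelf 2-D unwrapper, and every other frame is reached by accumulating the wrapped frame-to-frame increments forwards and backwards in time. The use of scikit-image and all names are assumptions, not the authors' code.

```python
# Temporal unwrapping from a reliable anchor frame, Eq. (16) (illustrative sketch).
import numpy as np
from skimage.restoration import unwrap_phase  # assumed available 2-D unwrapper

def unwrap_from_anchor(wrapped_stack, S):
    """wrapped_stack : (T, H, W) wrapped phase maps; S : index of the anchor frame."""
    T = wrapped_stack.shape[0]
    wrap = lambda p: np.angle(np.exp(1j * p))            # wrap to (-pi, pi]
    out = np.empty(wrapped_stack.shape, dtype=float)
    out[S] = unwrap_phase(wrapped_stack[S])              # Phi_uwS(x, y, S)
    for t in range(S + 1, T):                            # forwards in time
        out[t] = out[t - 1] + wrap(wrapped_stack[t] - wrapped_stack[t - 1])
    for t in range(S - 1, -1, -1):                       # backwards in time
        out[t] = out[t + 1] + wrap(wrapped_stack[t] - wrapped_stack[t + 1])
    return out
```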

3.2.5. Marked/coded fringe pattern for unwrapping phase of spatially isolated dynamic objects

When the measured dynamic object is spatially isolated, or breaks into several isolated parts (e.g., during an impact process), phase unwrapping becomes difficult. We embedded a special mark into the projected sinusoidal gratings to identify the fringe order [74]; it is shown in Fig. 5. The orientation of the Fourier spectrum of the mark is perpendicular to that of the fringes, so the mark does not affect the Fourier spectrum of the deformed fringe and can easily be extracted with a proper band-pass filter in the other direction.

Fig. 5. Marked fringes image.

The phase values on the same marked strip are equal and known, such as at points B and D in Fig. 6, so the phase values on the marked strip preserve the relationship between the separated fringes. The phase unwrapping of each broken part is then carried out starting from its local marked strip.

Fig. 6. Sketch map for phase unwrapping of a separated fringe.

For the same purpose, Wei-huang Su [75] made a structured pattern in which the sinusoidal fringes are encoded with both binary stripes and colorful grids. The binary stripes play a key role in identifying the local fringe order, while the colorful grid provides an additional degree of freedom to identify the stripes. Even when the inspected surfaces are colorful, the proposed encoding scheme can still identify the local fringe orders. This encoding scheme provides more reliable identification of the fringe orders since it is not sensitive to the observed colors.

In Guo and Huang's research work [76,77], a small cross-shaped marker is embedded in the fringe pattern to facilitate the retrieval of the absolute phase map from a single fringe pattern. It can be used with the Fourier transform method to improve its coordinate measurement accuracy while preserving its potential measurement speed.

4. Dynamic 3-D shape measurement methods and applications

4.1. 3-D shape and deformation measurement of a rotating object [78,79]

Real-time measurement of periodic motion, such as rotation and vibration, is important for studies of material properties, shape deformation and tension analysis. However, real-time 3-D shape measurement is difficult because most optical 3-D profilometry methods require multiple measurements. Even FTP had not been successful here, owing to the difficulties of object tracking and position detection.

The stroboscope [80] is an intense, high-speed light source used for the visual analysis of objects in periodic motion and for high-speed photography [81–83]. Objects in rapid periodic motion can be studied by using the stroboscope to produce an optical illusion of stopped or slowed motion: when the flash repetition rate of the stroboscope is exactly the same as the object's movement frequency, or an integral multiple thereof, the moving object appears to be stationary. This is called the 'stroboscopic effect'.

The combination of the stroboscopic effect and FTP can overcome this requirement. In Ref. [76], we used stroboscopic structured illumination for the surface shape determination of a rotating blade, and the deformation of the rotating blade measured in each cycle was recovered by this method.

4.2. Stroboscopic structured illumination system and control

Synchronization is very important for snapshot measurement of high-speed motion. To 'freeze' the motion and record an instantaneous, stationary image of a rotating object, the flash repetition rate of the stroboflash and the exposure of the CCD camera must be precisely synchronized with the periodic rotation.

Fig. 7. Block diagram of the sync control unit.

Fig. 8. Schematic diagram of the experimental setup for measuring the 3-D shape of a rotating object.

The block diagram of the stroboflash sync control unit is shown in Fig. 7. The unit has two control modes: interior trigger and exterior trigger. The interior trigger mode triggers the LED to flash and the frame grabber to capture an image according to a frequency set manually through software; the current frequency range covers 0 Hz to 20 MHz. This mode is used to study motion without obvious repetition, such as expansion, contraction or ballistic flight, and dynamic objects with known frequencies. The exterior trigger mode triggers the LED to flash and the frame grabber to capture an image according to the frequency signal output by the position detector, which tracks the motion of the dynamic object. In this mode, the system works automatically without any manual intervention. This mode is used for real-time measurement of dynamic objects with obvious repetition, such as rotation and vibration.

When the control unit is working in the exterior trigger mode, the output signal of the position detector triggers the frame grabber to start a new capture, triggers the LED flash and resets the CCD after a negligible lag (to ensure that the frame grabber's capture starts before the video signal is output).

4.3. Dynamic 3-D shape measurement of rotating object [78]

A schematic diagram of the experimental setup used for the measurement of a rotating object is given in Fig. 8. The image of a sinusoidal grating of 2 lines/mm is projected and focused onto the rotating blade surface. The deformed grating image is observed by a CCD camera (PULNiX-6AS) with a 16 mm focal length lens, via a video frame grabber with an external trigger function, digitizing the image at 512×512 pixels. The fan is located at a distance l_0 = 625 mm from the projection system exit pupil, and d = 650 mm.

In practice, we started to capture the deformed fringe images at the instant the fan began to rotate. From quiescence to stable rotation (about 1112 rpm, revolutions per minute), with one image per revolution, 100 frame images of the fan blade were obtained in total; two of them are shown in Figs. 9(a) and (b), corresponding to the 1st and 50th revolutions.

Fig. 9(c) gives the grid chart of the reconstructed shape at the 50th revolution. Fig. 9(d) shows the deformation (relative to the first revolution) of the whole reconstructed blade at the 50th revolution. Figs. 9(e) and (f) show the deformations (also relative to the first revolution) of the reconstructed blade at different revolutions along the line shown in each inset; the numbers marked on the left side of the curves are the corresponding revolutions. They clearly depict the twisting deformation of the blade with increasing speed: the farther a point is from the rotating shaft, the bigger its deformation. The maximal deformation is in excess of 8 mm.

Besides high-speed motion with obvious and variable repetition, such as rotation and vibration, this method can also be used to study high-speed motion without obvious repetition, such as expansion, contraction or ballistic flight, and high-speed objects with known, invariable frequencies. The method can also be extended with different detectors or sensors; for example, explosion phenomena with sound signals can be studied using a sound control unit instead of the optical position detector.

4.4. High-speed optical measurement of the vibrating shape [84,85]

The drum plays an important role in many musical styles. Understanding the characteristics of the vibration of the drum's membrane is important for studying its acoustic characteristics and for improving the manufacturing technique of the drum. The drumhead has been studied to a certain extent using various methods. Bertsch [17] employed laser interferometry, with a Polytec scanning vibrometer, to measure the vibration patterns of the Viennese timpani by quickly scanning 110 points on the membrane. His method is non-contact and achieves high accuracy, on the order of nanometers, in vibration amplitude measurement.

A schematic diagram of the experimental setup used for the vibrating drum measurement is shown in Fig. 10. The measured object is a Chinese double-sided drum with a diameter of 250 mm, shown in Fig. 11. An image of a sinusoidal grating, 2 lines/mm, is projected and focused onto the drum membrane, which is located at a distance of 780 mm (l_0) from the exit pupil of the projection system. The deformed grating image is observed by a high frame rate CCD camera (SpeedCam Visario, sampling rate up to 10,000 fps) with a zoom lens (Sigma Zoom, F82 mm, 1:2.8 DG, 24–70 mm). The distance between the projection system and the high-speed camera is 960 mm (d).

Fig. 9. Dynamic 3-D shape measurement of a rotating blade. (a), (b) Two deformed fringes recorded at the 1st and 50th revolutions; (c) the reconstructed shape at the 50th revolution; (d) the deformation (relative to the first revolution) at the 50th revolution; (e), (f) the deformations (relative to the first revolution) along the line shown in the inset.

Fig. 10. Schematic diagram of the experimental setup.

We hit the batter side of the drumhead three times quickly with a wooden drumstick and measured the whole vibration of the resonant side. During the whole vibration process, we recorded nearly 1000 fringe images of the vibrating membrane at 900×900 pixels and a 1000 fps sampling rate. One of the deformed fringes and its Fourier spectrum are shown in Figs. 12(a) and (b), respectively.

Fig. 11. The measured drumhead.

Fig. 12. Dynamic 3-D shape measurement of a vibrating drum. (a) The deformed fringe; (b) Fourier spectrum of (a); (c) position of the center point of the drumhead; (d) restored profiles at six sampling instants in one period; (e), (f) grid charts of the restored height distribution of the vibrating drumhead.

Fig. 12(c) shows the vibration position of the center point of the drumhead, from which we can see the times of the hits and the damped motion of the drumhead. The height distributions of the vibrating drumhead at the corresponding sampling instants are exactly restored. The amplitude range of this vibration is −1.79 to 1.84 mm. Fig. 12(d) gives the profiles of the center row at six sampling instants in one period; the number above each line is the corresponding sampling instant.

Figs. 12(e) and (f) show the grid charts at two sampling instants, which are given in their respective titles. Observing them, we find that there are two modes, (1,0) and (1,1), in one period of the principal vibration of the drumhead, which indicates that the point we hit is not the exact center of the drumhead. Furthermore, there are nonlinear effects (such as the uniform surface tension across the entire drumhead) that exert their own preference for certain modes by transferring energy from one vibration mode to another.

This method has the advantage of high-speed, full-field data acquisition and analysis, so its application can be extended to other fields, such as the dynamic 3-D shape measurement and analysis of instantaneous phenomena like inflation and deflation.

4.5. Dynamic measurement of vortex shape [86,87]

Another experiment on measuring a dynamic object is digitizing the motion of a vortex while the surface of poster paint is stirred. The deformed grating image is observed by a low-distortion TV CCD camera (TM560, 25 fps) with a 12 mm focal length lens, via a video frame grabber digitizing the image at 128×128 pixels. The paint is located at a distance l_0 = 890 mm from the projection system exit pupil, and d = 290 mm.

Fig. 13. Dynamic measurement of vortex shape. (a) One of the deformed fringes while the stirrer is working; (b) profiles of the reconstructed vortices at different times.

Fig. 14. Six frames of deformed fringe images.

In this experiment, we captured a fringe pattern as the reference plane while the paint was still. Then we began to capture the deformed fringe images at the moment the stirrer was turned on. From quiescence to stable rotation, 205 frame images of the paint were obtained in 10.4 s; one of them is shown in Fig. 13(a). Fig. 13(b) shows the profiles of the reconstructed vortices; the number under each line is the corresponding sampling instant. It clearly illustrates the formation and evolution of the vortex.

4.6. Dynamic 3-D shape measurement for breaking object [74]

Another application is the test of a breaking ceramic tile. The tile breaks into four spatially isolated blocks, and fringe discontinuities follow. The marked fringe introduced in Section 3.2.5 has been used to keep the phase unwrapping process error free. The CCD camera is a Lumenera Lu050M (320×240 pixels, with a 12 mm lens and a 200 fps sampling rate). The whole breaking process contains 47 frames and lasts 235 ms. Six of them are shown in Fig. 14; their corresponding sampling instants are 745, 750, 755, 765, 775 and 785 ms, respectively (from the beginning of capture). Fig. 15 shows the reconstructed 3-D shape of the breaking tile at the corresponding sampling instants in Fig. 14.

4.7. Dynamic 3-D shape measurement for face in chewing [88,89]

We designed a low-cost measurement system and used it to reconstruct the 3-D facial shape and its movement during chewing. The projected image was a sinusoidal grating of 3 lines/mm, and the CCD camera was a commercial one (MINTRON MTV 1881EX) with a 50 mm focal length lens; the image was digitized at 512×512 pixels. The head was located at a distance l_0 = 2220 mm from the projection system exit pupil, and d = 590 mm. One hundred and seventeen frame images of the facial movement were recorded in 4.68 s at video rate (25 fps). One frame is shown in Fig. 16(a), and the area inside the white dashed frame is the region to be reconstructed. Fig. 16(b) draws the contour chart of the reconstructed facial shape at t = 1.04 s (the corresponding frame number is 26), and Fig. 16(c) gives the grid chart of the reconstructed facial shape at the same time. These digital facial shapes can be used in biomedicine for cosmetology and orthopedics, or in security systems for face recognition.

Fig. 15. Six grid charts of the restored shape of the breaking tile.

Fig. 16. Dynamic 3-D measurement of facial shape and movement. (a) One frame of deformed fringe; (b) contour chart of the frame at t = 1.04 s; (c) grid chart of the reconstructed facial shape.

4.8. Dynamic 3-D shape measurement for woofer’s vibration [90]

An experiment was performed on a woofer speaker, in the hope of estimating whether the woofer is an effective radiator of sound and determining whether the speaker performs in the expected way. The projected grating was 2 lines/mm, and the images were captured at 630×630 pixels and 800 fps. A function generator and a power amplifier were used to create a sine wave, which was fed to the measured woofer and synchronously captured by an oscillograph as the input signal, and a high-speed camera (Basler 504k with a 16 mm focal length lens) was employed to record the deformed fringe patterns produced by the cone's vibration. The reconstructed vibration was taken as the output of the woofer and compared with the input obtained from the oscillograph.

The measured woofer's cone was driven by a 5 V sine wave. Five hundred frames of deformed fringe images of the vibrating woofer were obtained in 625 ms, and one of them is shown in Fig. 17(a). The height distributions of the vibrating woofer at the corresponding sampling instants are exactly restored, as shown in Fig. 17(b). Fig. 17(c) shows the positions of the center point of the woofer during 50 ms of vibration (40 frames) and indicates each sampling instant with an asterisk (*). From this figure, we can clearly see that the motion of one point on the woofer's cone is also a sine wave, and we can work out that the principal vibration frequency of this woofer is near 50 Hz (there are two and a half periods in 50 ms), which corresponds exactly to the input signal. Fig. 17(d) gives the profiles of the center rows at nine sampling instants in one period; the number (unit: ms) labeled near each line is the corresponding sampling instant. The vibration range (the amplitude displacement of the speaker cone, pumping in and pumping out) is from −1.25 to 1.14 mm.

4.9. Dynamic 3-D shape measurement for impact [91]

Another experiment was carried out on a thin plate impacted by a flying rammer. The projected grating was 2 lines/mm. The thin plate was fixed over the central hole of a stable mounting plate and located at a distance l_0 = 550 mm from the projection system exit pupil, with d = 360 mm. The deformed grating image (180×180 pixels) was observed by a high frame rate CCD camera (Lumenera Lu050M, 200 fps) with a 12 mm focal length lens. We employed a flying rammer, tightly bound to the end of a string, to horizontally impact the thin plate. The rammer was flying towards the thin plate at the instant image capture commenced. From the beginning to the end of the capture, 200 frames of deformed fringe images were obtained in one second. In subsequent image processing, we found that the front part, nearly 70 frames, was obtained before the impact and the rear part, nearly 70 frames, was obtained after the impact was complete; that is to say, the whole impact lasted only about 0.3 s. Three of the 200 deformed fringes are shown in Fig. 18; their frame sequence numbers are 86, 101 and 130 (their corresponding times are 430, 505 and 650 ms after the start of capture).

Fig. 17. Dynamic 3-D shape measurement of a vibrating woofer. (a) One deformed fringe; (b) the reconstructed shape; (c) positions of the center point during vibration; (d) profiles of the center rows at nine sampling instants in one period.

Fig. 18. Three frames of deformed fringes at different times.

Fig. 19. 3-D reconstructed height distributions of the impacted object's surface.


Fig. 19 gives the grid charts of the reconstructed height distributions for the corresponding frames shown in Fig. 18.

5. Real-time 3-D TV camera [62–66]

Among the several methods used for 3-D surface measurement, triangulation is the most common way of detecting depth information. Systems based on triangulation use either stereoscopic images taken by several cameras [62] or structured light projection [63]. These triangulation methods are sometimes limited by shadowing. The time-of-flight (TOF) method has the advantage of quick, straightforward information processing [64]. The conventional system using the TOF method, however, requires two-dimensional scanning of laser beams to make a depth image of an object, so it is only suitable for stationary objects.

Recently, a high-definition real-time depth-mapping TV camera was proposed, named the HDTV Axi-vision camera [65,66]. The camera achieves simultaneous high-resolution 2-D color imaging and depth sensing, and the depth information can be calculated in real time.

The principle of depth mapping is shown in Fig. 20 [66]. The intensity of an ultra-fast snapshot of an object becomes dependent on the distance from the camera to the object if the intensity of the illuminating light is varied at a speed comparable to the speed of light. In Fig. 20, the upper section shows the arrangement of the components, with the infrared intensity-modulated LED illumination sources on the left and a CCD camera with an ultra-fast shutter utilizing an image intensifier on the right. The lower section of the figure shows a time diagram of the illuminating and reflected light during the first three video-frame cycles. The solid triangular line indicates the triangularly intensity-modulated illuminating LED light at the source. The dotted triangular line indicates the light intensity at the camera after reflection from one particular point P on the object. The triangle formed by the dotted line is delayed by Δt = 2d/v with respect to that formed by the solid line, because of the time required for the light to make the round trip, where d is the distance from the camera to point P and v is the velocity of light. The pair of vertical black dotted lines indicates the duration during which the ultra-fast shutter in front of the CCD camera is open.

Fig. 20. Principle of acquiring depth information by using intensity-modulated illumination and an ultra-fast camera shutter using an image intensifier (from Ref. [66]).

If the shutter opens during the ascending intensity modulation, as shown in the first video frame, the input light to the CCD camera during the exposure time (represented by the hatched section in Fig. 20) decreases with the distance d to the object, because the dotted line shifts to the right. During this cycle of illumination, less input light to the CCD camera means a longer distance to the object. Conversely, if the shutter opens during the descending intensity modulation, as shown in the second video frame, more input light to the CCD camera means a longer distance.
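Combining the two exposures cancels the reflectivity of point P, so a relative depth can be obtained from their ratio. The following is a deliberately simplified numerical sketch of that intensity-ratio idea under ideal triangular modulation; it is not the Axi-vision camera's actual processing chain, and the names and the linear calibration are assumptions.

```python
# Simplified depth-from-intensity-ratio sketch (not the actual Axi-vision pipeline).
import numpy as np

C = 3.0e8  # speed of light, m/s

def depth_from_ratio(image_up, image_down, mod_period):
    """image_up / image_down : gated exposures taken on the ascending and
       descending slopes of the triangular modulation; mod_period : period T in s."""
    up = image_up.astype(float)
    down = image_down.astype(float)
    # On the ascending slope the gated intensity decreases with the round-trip
    # delay 2d/c, on the descending slope it increases; the ratio below removes
    # the (unknown) reflectance and is linear in the delay.
    ratio = down / (up + down + 1e-12)
    delay = ratio * (mod_period / 2.0)   # delay mapped into one half-period
    return 0.5 * C * delay               # d = c * delay / 2, relative depth map
```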

The 3-D TV camera system can thus detect 3-D information, providing both high-resolution 2-D color imaging and depth-map imaging. Such systems have a wide range of applications in TV program production, 3-D modeling and robotic vision, as well as graphics animation.

6. Conclusion

There is an increasing demand for obtaining 3-D information, and the trend is to extend the existing techniques to deal with dynamic processes. As a non-contact shape measurement technique, Fourier transform profilometry has been successfully applied to dynamic processes in recent years, owing to its merits of needing only one fringe pattern, full-field analysis and high precision.

In this paper, we have mainly reviewed our research work based on this method over the past ten years. The basic principles of this method and a variety of phase unwrapping techniques have also been introduced. We have also listed a number of typical applications, such as the combination of the stroboscopic effect and FTP to test objects in rapid periodic motion, high-speed optical measurement of the vibrating membrane of a drum, and dynamic measurement of a rotating vortex, a breaking tile, a chewing face, a vibrating woofer and an impacted plate.

Although this method has been well studied and applied in different application fields, some challenges still need to be addressed in future work. (1) Real-time 3-D shape measurement, which is the key to successfully implementing 3-D coordinate display and measurement, manufacturing control and on-line quality inspection, is more difficult for a dynamic process. At present, the data processing of a dynamic process is executed after the event, so there is still a long road to realizing real-time 3-D dynamic shape measurement. (2) Because the object is varying, the accuracy of dynamic 3-D shape measurement is usually lower than that of static 3-D shape measurement. Generally, the different experimental setup of each measurement leads to a different accuracy; in the applications presented in this paper, the accuracy of this kind of method is on the order of tens of micrometers. For example, in the vibrating drumhead measurement, the standard deviation of the restored height is 0.075 mm, and in the rotating blade measurement, the standard deviation of the heights is less than 0.022 mm. A primary task of future work will be to improve the accuracy of dynamic 3-D shape measurement to meet industrial application requirements. The overall accuracy of a measurement system is determined by the system parameter d/l_0 and the grating period p_0; it can be improved by increasing d/l_0 or decreasing p_0 according to the maximum variation range of the measured height. (3) The environmental requirements for dynamic measurement are tougher than those for static 3-D shape measurement, for example the influence of varying ambient light on the stroboscope and the influence of unexpected motion of peripheral equipment. (4) There is also a lack of research activity on overcoming the shading problem inherent in grating-projection-based techniques.

This dynamic 3-D shape measurement method has the advantage of high-speed, full-field data acquisition and analysis. With the development of high-resolution CCD cameras and high frame rate frame grabbers, the method, as presented here, should be a promising one for studying high-speed motion, including rotation, vibration, explosion, expansion, contraction and even shock waves.

Acknowledgement

This project was supported by the National Natural Science Foundation of China (Nos. 60838002, 10876021, 60527001 and 60807006).

References

[1] Chen F, Brown GM, Song M. Overview of three-dimensional shape measurement using optical methods. Opt Eng 2000;39(1):10–22.

[2] Jahne B, Haußecker H, Geißler P. Handbook of computer vision and applications, volume 1: sensors and imaging. San Diego: Academic Press; 1999 (Chapters 17–21).

[3] http://www.mova.com/.

[4] Rae PJ, Goldrein HT, Bourne NK, et al. Measurement of dynamic large-strain deformation maps using an automated fine grid technique. Opt Laser Eng 1999;31:113–22.

[5] Sevenhuijsen PJ, Sirkis JS, Bremand F. Current trends in obtaining deformation data from grids. Exp Tech 1993;17(3):22–6.

[6] Lopez CP, Santoyo FM, Rodríguez VR, Funes-Gallanzi M. Separation of vibration fringe data from rotation object fringes using pulsed ESPI. Opt Laser Eng 2002;38(2):145–52.

[7] Servin M, Davila A, Quiroga JA. Extended-range temporal electronic speckle pattern interferometry. Appl Opt 2002;41(22):4541–7.

[8] Fan H, Wang J, Tan YS. Simultaneous measurement of whole in-plane displacement using phase-shifting ESPI. Opt Laser Eng 1997;28:249–57.

[9] Petitgrand S, Yahiaoui R, Danaie K, Bosseboeuf A, Gillers JP. 3-D measurement of micromechanical devices vibration mode shapes with a stroboscopic interferometric microscope. Opt Laser Eng 2001;36(2):77–101.

[10] Maa CC, Huang CH. Experimental whole-field interferometry for transverse vibration of plates. J Sound Vib 2004;271:493–506.

[11] Chen LC, Huang YT, Fan KC. A dynamic 3-D surface profilometer with nano-scale measurement resolution and MHz bandwidth for MEMS characterization. IEEE/ASME Trans Mechatron 2007;12(3):299–307.

[12] Chen LC, Huang YT, Nguyen XL, Chen JL, Chang CC. Dynamic out-of-plane profilometry for nano-scale full-field characterization of MEMS using stroboscopic interferometry with novel signal deconvolution algorithm. Opt Laser Eng 2009;47:237–51.

[13] Guo T, Chang H, Chen JP, Fu X, Hu XT. Micro-motion analyzer used for dynamic MEMS characterization. Opt Laser Eng 2008.

[14] Brown GC, Pryputniewicz RJ. New test methodology for static and dynamic shape measurements of microelectromechanical systems. Opt Eng 2000;39:127–36.

[15] Osten W, Juptner W, Seebacher S, Baumach T. The qualification of optical measurement techniques for the investigation of materials parameters of microcomponents. Proc SPIE 1999;3825:152–64.

[16] Nakagawa H, Kurita Y, Ogawa K, et al. Experimental analysis of chatter vibration in end-milling using laser Doppler vibrometers. J Autom Technol 2008;2(6):431–8.

[17] Bertsch M. Vibration patterns and sound analysis of the Viennese timpani. Proc ISMA 2001;2:281–4.

[18] Tian J, Peng X. 3-D digital imaging based on shifted point-array encoding. Appl Opt 2005;44(26):5491–6.

[19] Watanabe Y, Komuro Y, Ishikawa M. 955-fps real-time shape measurement of a moving/deforming object using high-speed vision for numerous-point analysis. IEEE Robot Autom 2007:3192–7.

[20] Cheng XX, Su XY, Gou LR. Automatic measurement method for 360° profilometry of 3-D diffuse objects. Appl Opt 1991;30(10):1274–8.

[21] Li JL, Su XY, Zhou WS. The 3-D sensing using laser sheet projection: influence of speckle. Opt Rev 1995;2(2):144–9.

[22] Asundi A, Zhou WS. Unified calibration technique and its applications in optical triangular profilometry. Appl Opt 1999;38(16):3556–61.

[23] Asundi A, Zhou WS. Mapping algorithm for 360-deg. profilometry with time delayed integration imaging. Opt Eng 1999;38(2):339–44.

[24] Takasaki H. Generation of surface contours by moire pattern. Appl Opt 1970;9(4):942–7.

[25] Yoshizawa T. The recent trend of moire metrology. J Robot Mechatron 1991;3(3):80–5.

[26] Wang B, Luo X, Pfeifer T, Mischo H. Moire deflectometry based on Fourier-transform analysis. Measurement 1999;25(4):249–53.

[27] Sajan MR, Tay CJ, Shang HM, Asundi A. TDI imaging and scanning moire for online defect detection. Opt Laser Technol 1997;29(6):327–31.

[28] Srinivasan V, Liu HC, Halioua M. Automated phase-measuring profilometry of 3-D diffuse objects. Appl Opt 1984;23(18):3105–8.

[29] Halioua M, Liu HC. Optical three-dimensional sensing by phase measurement profilometry. Opt Laser Eng 1989;11:115–85.

[30] Su XY, Zhou WS, von Bally V, Vukicevic D. Automated phase-measuring profilometry using defocused projection of a Ronchi grating. Opt Commun 1992;94(6):561–73.

[31] Li J, Su HJ, Su XY. Two-frequency grating used in phase-measuring profilometry. Appl Opt 1997;36(1):277–80.

[32] Zhang H, Lalor MJ, Burton DR. Spatiotemporal phase unwrapping for the measurement of discontinuous objects in dynamic fringe-projection phase-shifting profilometry. Appl Opt 1999;38(16):3534–41.

[33] Takeda M, Ina H, Kobayashi S. Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry. J Opt Soc Am 1982;72(1):156–60.

[34] Takeda M, Mutoh K. Fourier transform profilometry for the automatic measurement of 3-D object shapes. Appl Opt 1983;22(24):3977–82.

[35] Li J, Su XY, Guo LR. An improved Fourier transform profilometry for automatic measurement of 3-D object shapes. Opt Eng 1990;29(12):1439–44.

[36] Su XY, Chen WJ. Fourier transform profilometry: a review. Opt Laser Eng 2001;35(5):263–84.

[37] Su XY, Chen WJ, Zhang QC, Cao YP. Dynamic 3-D shape measurement method based on FTP. Opt Laser Eng 2001;36:49–64.

[38] Su XY, Su LK. New 3D profilometry based on modulation measurement. Proc SPIE 1998;3558:1–7.

[39] Su XY, Su LK, Li WS. A new Fourier transform profilometry based on modulation measurement. Proc SPIE 1999;3749:438–9.

[40] Su LK, Su XY, Li WS. Application of modulation measurement profilometry to objects with surface holes. Appl Opt 1999;38(7):1153–8.

[41] Toyooka S, Tominaga M. Spatial fringe scanning for optical phase measurement. Opt Commun 1984;51(2):68–70.

[42] Sajan MR, Tay CJ, Shang HM, et al. TDI imaging—a tool for profilometry and automated visual inspection. Opt Laser Eng 1998;29(6):403–11.

[43] Sajan MR, Tay CJ, Shang HM, Asundi A. Improved spatial phase detection for profilometry using a TDI imager. Opt Commun 1998;150(1–6):66–70.

[44] Hausler G, Ritter D. Parallel three-dimensional sensing by color-coded triangulation. Appl Opt 1993;32(35):7164–9.


[45] Zhang L, Curless B, Seitz SM. Rapid shape acquisition using color structured light and multi-pass dynamic programming. IEEE Int Symp 3D Data Process Visualization Transm 2002:24–36.

[46] Geng ZJ. Rainbow 3-D camera: new concept of high-speed three-dimensional vision system. Opt Eng 1996;35:376–83.

[47] Xu ZQ, Ye SH, Fan GZ. Color 3D reverse engineering. J Mater Process Technol 2002;129(1):495–9.

[48] Kawasaki H, Furukawa R, Sagawa R, Yagi Y. Dynamic scene shape reconstruction using a single structured light pattern. IEEE Comput Vision Pattern Recogn (CVPR) 2008:1–8.

[49] Pages J, Salvi J, Matabosch C. Implementation of a robust coded structured light technique for dynamic 3D measurements. IEEE Image Process 2003;3(2):1073–6.

[50] Besl PJ, et al. Active optical image sensors. In: Sanz JLC, editor. Advances in machine vision. Berlin: Springer; 1989.

[51] Zhang L, Curless B, Seitz S. Spacetime stereo: shape recovery for dynamic scenes. IEEE Comput Vision Pattern Recogn 2003:367–74.

[52] Davis J, Ramamoorthi R, Rusinkiewicz S. Spacetime stereo: a unifying framework for depth from triangulation. IEEE Trans Pattern Anal Mach Intell 2005;27(2):1–7.

[53] Rodriguez-Vera R, Servin M. Phase locked loop profilometry. Opt Laser Technol 1994;26(6):393–8.

[54] Li NB, Peng X, Xi JT, Chicharo JF, Yao JQ, Zhang DW. Multi-frequency and multiple phase shift sinusoidal fringe projection for 3D interferometry. Opt Exp 2005;13(5):1561–9.

[55] Zhang Z, Zhang D, Peng X. Performance analysis of 3-D full field sensor based on fringe projection. Opt Laser Eng 2004;42:341–53.

[56] Quan C, Tay CJ, Huang HY. 3-D deformation measurement using fringe projection and digital image correlation. Optik 2004;115(4):164–8.

[57] Cheng P, Hu J, Zhang G, Hou L, Xu B, Wu X. Deformation measurements of dragonfly's wings in free flight by using Windowed Fourier Transform. Opt Laser Eng 2008;46:157–61.

[58] Huang PS, Zhang C, Chiang FP. High speed 3-D shape measurement based on digital fringe projection. Opt Eng 2003;42(1):163–8.

[59] Zhang S, Huang PS. High-resolution, real-time three-dimensional shape measurement. Opt Eng 2006;45(12):123601–18.

[60] Zhang GJ, Ye SH. Online measurement of the size of standard wire sieves using an optical Fourier transform. Opt Eng 2000;39(4):1098–102.

[61] Li Y, Nemes JA, Derdouri A. Optical 3-D dynamic measurement system and its application to polymer membrane inflation tests. Opt Laser Eng 2000;33:261–76.

[62] Kimura S, Kano H, Kanade T, Yoshida A, Kawamura E, Oda K. CMU video-rate stereo machine. Mobil Mapp Symp Photogrammetry Remote Sensing 1995:9–18 (also available from http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.33.3692).

[63] Sato K, Inokuchi S. Range-imaging system utilizing nematic liquid crystal mask. ICCV 1987:657–61.

[64] Jarvis RA. A laser time-of-flight range scanner for robotic vision. IEEE Trans Pattern Anal Mach Intell PAMI-5 1983:505–12.

[65] Kawakita M, Iizuka K, Aida T, Kikuchi H, Fujikake H, Yonai J, et al. Axi-Vision Camera (real-time depth-mapping camera). Appl Opt 2000;39:3931–9.

[66] Kawakita M, Iizuka K, Aida T, Kikuchi H, Fujikake H, Yonai J, et al. High-definition real-time depth-mapping TV camera: HDTV Axi-Vision Camera. Opt Exp 2004;12(12):2781–94.

[67] Li WS, Su XY, Liu ZB. Large-scale three-dimensional object measurement: a practical coordinate mapping and imaging data-patching method. Appl Opt 2001;40(20):3326–33.

[68] Abdul-Rahman HS, Gdeisat MA, Burton DR, Lalor MJ, Lilley F, Abid A. Three-dimensional Fourier fringe analysis. Opt Laser Eng 2008;46:446–55.

[69] Su XY, Chen WJ. Reliability-guided phase unwrapping algorithm: a review. Opt Laser Eng 2004;42(3):245–61.

[70] Judge TR, Bryanston-Cross PJ. A review of phase unwrapping techniques in fringe analysis. Opt Laser Eng 1994;21:199–239.

[71] Su XY. Phase unwrapping techniques for 3-D shape measurement. Proc SPIE 1996;2866:460–5.

[72] Li JL, Su XY, Li JT. Phase unwrapping algorithm based on reliability and edge-detection. Opt Eng 1997;36(6):1685–90.

[73] Asundi AK, Zhou WS. Fast phase-unwrapping algorithm based on a gray-scale mask and flood fill. Appl Opt 1999;38(16):3556–61.

[74] Xiao YS, Su XY, Zhang QC, Li ZR. 3-D profilometry for the impact process with marked fringes tracking. Opto-Electron Eng 2007;34(8):46–52 (in Chinese).

[75] Su WH. Projected fringe profilometry using the area-encoded algorithm for spatially isolated and dynamic objects. Opt Exp 2008;16(4):2590–6.

[76] Guo H, Huang PS. 3-D shape measurement by use of a modified Fourier transform method. Proc SPIE 2008;7066 (p. 70660E-1–8).

[77] Guo H, Huang PS. Absolute phase retrieval for 3D shape measurement by the Fourier transform method. Proc SPIE 2007;6762:676110–204.

[78] Zhang QC, Su XY, Cao YP, Li Y, Xiang LQ, Chen WJ. An optical 3-D shape and deformation measurement for rotating blades using stroboscopic structured illumination. Opt Eng 2005;44(11):113601–17.

[79] Su XY, Zhang QC, Xiang LQ, Cao YP, Chen WJ. A stroboscopic structured illumination system used in dynamic 3D visualization of high-speed motion objects. Proc SPIE 2005;5852:796–9.

[80] Edgerton HE. Stroboscope. http://web.mit.edu/museum/exhibits/.

[81] Asundi AK, Sajan MR. Low-cost digital polariscope for dynamic photoelasticity. Opt Eng 1994;33(9):3052–5.

[82] Asundi AK, Sajan MR. Digital drum camera for dynamic recording. Opt Eng 1996;35(6):1707–13.

[83] Asundi AK, Sajan MR. Multiple LED camera for dynamic photoelasticity. Appl Opt 1995;34(13):2236–40.

[84] Zhang QC, Su XY. High-speed optical measurement for the drumhead vibration. Opt Exp 2005;13(8):3310–6.

[85] Su XY, Zhang QC, Li J, Li ZR. Optical 3D shape measurement for vibrating drumhead. Proc SPIE 2006;6027:60271.

[86] Zhang QC, Su XY. An optical measurement of vortex shape at a free surface. Opt Laser Technol 2002;34(2):107–13.

[87] Zhang QC, Su XY. Dynamic liquid surface measurement. Acta Optica Sinica 2001;21:1506–8 (in Chinese).

[88] Zhang QC, Su XY, Chen WJ, Cao YP, Xiang LQ. An optical real-time 3-D measurement for facial shape and movement. Proc SPIE 2003;5254:214–21.

[89] Zhang QC, Su XY, Chen WJ, et al. An optical real-time 3-D measurement for facial shape and movement in chewing. J Optoelectron Laser 2004;15(2):194–9 (in Chinese).

[90] Zhang QC, Su XY, Xiang LQ. Whole-field vibration analysis of a woofer's cone using a high-speed camera. Proc SPIE 2007;6279:627900-1–5.

[91] Zhang QC, Su XY. Three-dimensional shape visualization of object in impact. Proc SPIE 2006;6027II:60273E.