

STATISTICAL ANALYSIS OF MODAL PARAMETERS USING THE BOOTSTRAP

Thomas L. Paez*, Norman F. Hunter†

* Experimental Structural Dynamics Department, Sandia National Laboratories, Albuquerque, New Mexico

† Los Alamos National Laboratory, Los Alamos, New Mexico

ABSTRACT. Structural dynamic testing is concerned with the estimation of system properties, including frequency response functions and modal characteristics. These properties are derived from tests on the structure of interest, during which excitations and responses are measured and Fourier techniques are used to reduce the data. The inputs used in a test are frequently random, and they excite random responses in the structure of interest. When these random inputs and responses are analyzed they yield estimates of system properties that are random variables and random process realizations. Of course, such estimates of system properties vary randomly from one test to another, but even when deterministic inputs are used to excite a structure, the estimated properties vary from test to test. When test excitations and responses are normally distributed, classical techniques permit us to statistically analyze inputs, responses, and some system parameters. However, when the input excitations are non-normal, the system is nonlinear, and/or the property of interest is anything but the simplest, the classical analyses break down. The bootstrap is a technique for the statistical analysis of data that are not necessarily normally distributed. It can be used to statistically analyze any measure of input excitation or response, or any system property, when data are available to make an estimate. It is designed to estimate the standard error, bias, and confidence intervals of parameter estimates. This paper shows how the bootstrap can be applied to the statistical analysis of modal parameters.

NOMENCLATURE

system parameters
cdf (cumulative distribution function) and its estimator
matrix, vector of future values in CVA
matrix, vector of past values in CVA
transformation matrix
components of SVD
data sample
bootstrap data sample
pdf (probability density function)
dynamic system parameters
optimum memory of past
estimate of standard error
system input noise
x(t)  structural response
y(t)  system response
θ̂    statistic, parameter estimate
θ̂*   bootstrap replicate of statistic

1. INTRODUCTION AND MOTIVATION

Analysis of structural test data follows the sequence shown in Figure 1. Spectral densities and transfer functions are derived using averages of input and response Fourier spectra. Modal frequencies, dampings, and mode shapes are identified by fitting linear oscillatory models to the observed spectral properties. Because of measurement noise, temporal and sample variation of system parameters, and nonlinearity of real systems, parameter estimates vary from one analysis to the next. It is desirable to establish the relative degree of variation of parameter estimates so that we can gauge the accuracy of system characterizations.

Statistical measures of accuracy like bias, standard error, and confidence intervals can be established for some spectral estimators like autospectral density, cross-spectral density, and frequency response function. Other system characteristics like system mode frequencies, dampings, and mode shapes are more difficult to analyze because the link between the underlying data and the modal measure of interest is quite complex. Some of the spectral estimators that can be analyzed using classical approaches are described in Bendat and Piersol (1986) and Wirsching, Paez, and Ortiz (1995). These statistics are based on the assumptions of stationary, Gaussian data, and on rather complex statistical derivations.

In this paper we explore the accuracy of modal parameter estimates from another viewpoint. We utilize a recently derived statistical technique called the bootstrap (Efron, 1979) to directly estimate the error bounds associated with the computation of modal frequencies. A recent paper (Hunter and Paez, 1995) shows how the bootstrap can be used to perform statistical analysis on spectral densities and transfer functions from test data. Assumptions of stationarity, linearity, or Gaussianity are not required. We demonstrate the technique on the modal parameters of data from a multidegree-of-freedom structure.


Figure 1. Analysis of structural test data: Measurement of Excitation and Response Time Series → Computation of Spectral Densities, Frequency Response Functions, Canonical Variate Models → System Identification: Estimates of Resonant Frequencies, Damping Values, Mode Shapes.

2. THE BOOTSTRAP PROCEDURE

The objective of bootstrap analysis is to assess the accuracy of parameter estimates that are statistics of measured data by estimating standard error, confidence intervals, and bias. To perform a bootstrap analysis, we measure data from a random source and assume that the observed data represent the source. The source is assumed to generate realizations with an unknown probability distribution. Each observed data point is assigned a probability of occurrence of 1/n, where n is the total number of data points measured. A bootstrap sample of the data is created by selecting at random, with replacement, n elements from the measured data set. This process is illustrated in Figure 2. The procedure is readily implemented using a uniform random number generator which selects, with equal probability, integer values in the range 1 to n. Sampling is done with replacement, so each bootstrap sample may have several occurrences of some data values, and other data values may be absent.
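The sampling step just described is compact enough to sketch in code. The following is a generic illustration, not code from the original study; the function name is an assumption, and the measured data are taken to be a plain Python list:

```python
import random

def bootstrap_sample(data, rng=random):
    """Draw n elements from data at random, with replacement."""
    n = len(data)
    # A uniform integer index in the range 0..n-1 gives each measured
    # point a probability 1/n of being chosen on every draw, so a
    # bootstrap sample may repeat some values and omit others.
    return [data[rng.randrange(n)] for _ in range(n)]
```

Each call produces one bootstrap sample X*; repeating the call many times builds the ensemble of samples from which replicates are computed.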

F → X = (x1, x2, ..., xn)

Creation of a bootstrap sample is accomplished through random selection among the elements of X:

F̂ → X* = (x1*, x2*, ..., xn*)

Figure 2. Obtaining a bootstrap sample.

In a bootstrap analysis, numerous bootstrap samples are created. The statistic of interest is computed from each bootstrap sample; the resulting quantities are known as bootstrap replicates of the statistic of interest. Standard error, confidence intervals, and bias of the statistic of interest are computed using standard techniques and formulas on the bootstrap replicates. For example, let B denote the number of bootstrap samples used in an analysis, and let θ̂*(b), b = 1, ..., B, denote the bootstrap replicates of the statistic of interest. Then the standard error of the statistic of interest is estimated with

se_B = sqrt( (1/(B − 1)) Σ_{b=1..B} [θ̂*(b) − θ̂*(·)]² ),  where θ̂*(·) = (1/B) Σ_{b=1..B} θ̂*(b)    (1)

In one type of bootstrap analysis, the two-sided (1 − α) × 100% confidence intervals are obtained by sorting the bootstrap replicates of the statistic of interest, identifying (or interpolating) the (α/2) × 100% percentile value and the (1 − α/2) × 100% percentile value in the sorted list, and using the identified values as the limits of the confidence interval. Another, more advanced method for confidence interval estimation is discussed in Efron and Tibshirani (1993).

The number of bootstrap samples, B, used in an analysis ranges from 25 to several thousand. The standard error of a parameter estimate may be computed using 25 to 50 bootstrap samples. Accurate computation of the confidence intervals of an estimated parameter requires analysis of a thousand or more bootstrap samples.

Bootstrap sampling provides an optimal estimate of the probability density function which characterizes the data source, given that our knowledge of the source is limited to the measured data. Computation of a statistic from the bootstrap samples simulates computation of the same statistic on samples drawn from the real-world distribution. Properties of the "real world" distribution are estimated in the "bootstrap world" as illustrated in Figure 3.

3. A BOOTSTRAP EXAMPLE

Consider a set of data drawn from a random source with the probability density function illustrated in Figure 4. One hundred data points are generated using a random source with this density.

We assume that the 100 points are characteristic of the source. The mean of the sampled points is 1.3440. Using the bootstrap procedure outlined above, we create 400 bootstrap samples of these 100 points. (Normally, each bootstrap sample contains as many points as are available in the original measured data set.) From each sample we compute the sample mean. The standard deviation of these sample means is 0.0660; this is the standard error of the mean estimate. The theoretical mean of this distribution is 1.3333. The 400 bootstrap replications of the original data also allow computation of confidence intervals on our estimated mean. The 99% confidence interval on the mean is (1.1471, 1.5073). The true mean lies well within this interval. To further illustrate typical bootstrap results, Table 1 shows the results of seven different realizations of the distribution. In each case, the true mean lies well within the confidence intervals indicated. Note that, as expected, a smaller number of points leads to a broader confidence interval.
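The computation in this example can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the function name, seed, and stand-in data below are assumptions, and the source density of Figure 4 is not reproduced:

```python
import math
import random

def bootstrap_mean_stats(data, n_boot=400, alpha=0.01, seed=1):
    """Bootstrap standard error and (1 - alpha) percentile confidence
    interval for the sample mean."""
    rng = random.Random(seed)
    n = len(data)
    replicates = []
    for _ in range(n_boot):
        # Resample n points with replacement, then record the sample mean.
        sample = [data[rng.randrange(n)] for _ in range(n)]
        replicates.append(sum(sample) / n)
    center = sum(replicates) / n_boot
    # Standard error: standard deviation of the bootstrap replicates.
    se = math.sqrt(sum((r - center) ** 2 for r in replicates) / (n_boot - 1))
    # Percentile confidence interval: sort and read off the quantiles.
    replicates.sort()
    lo = replicates[int((alpha / 2) * n_boot)]
    hi = replicates[int((1 - alpha / 2) * n_boot) - 1]
    return se, (lo, hi)
```

With 400 replicates of a 100-point sample, this mirrors the analysis in the text: the replicate standard deviation estimates the standard error of the mean, and the sorted replicates give the percentile confidence interval.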



Figure 3. The bootstrap approximation to the real world. The observed distribution is our best estimate of the true, unknown distribution F. The observed sample is X = (x1, x2, ..., xn), and the statistic of interest θ̂ = s(X) can be computed based on it. In the bootstrap world the observed data are used to generate as many bootstrap samples X* as we wish. Each bootstrap sample is used in the formula θ̂* = s(X*) to compute a bootstrap replicate of the statistic of interest. The bootstrap replicates are used to analyze the standard error, confidence intervals, and bias of the statistical estimator.

Figure 4. Probability density function of a random source.

Table 1. Mean and confidence intervals for different realizations of the random data source shown in Figure 4.

NPTS        Mean      Std. Err.  Lower      Upper      True
Generated   Estimate  Mn. Est.   99% C.I.   99% C.I.   Mean
100         1.3440    0.0660     1.1471     1.5073     1.3333
100         1.3571    0.0612     1.1325     1.5250     1.3333
50          1.3274    0.0881     1.1042     1.5677     1.3333
25          1.3250    0.0898     1.0513     1.5287     1.3333

NPTS        Mean      Std. Err.  Lower      Upper      True
Generated   Estimate  Mn. Est.   90% C.I.   90% C.I.   Mean
25          1.3761    0.1181     1.1862     1.5898     1.3333
50          1.2419    0.0903     1.1046     1.3949     1.3333
100         1.3607    0.0635     1.2605     1.4706     1.3333

4. APPLICATION OF THE BOOTSTRAP TO THE ANALYSIS OF MODAL PARAMETERS

We showed in the previous section that the bootstrap is a technique for the accuracy analysis of statistical estimators. Among other things, it can be used to estimate the standard error and confidence intervals of statistical estimators. Figures 2 and 3 and the text in Section 2 make it clear that we need to build up an ensemble of bootstrap replicates in order to use the bootstrap technique to analyze variability in a system parameter. As mentioned above, we are interested in doing this for the modal frequencies of a structural system.

There are several techniques for estimating the modal parameters of a system based on measured data. Any of these methods that permits the generation of bootstrap samples from measured data, and then the use of those bootstrap samples to compute modal parameters (these are the bootstrap replicates), can be used to analyze the statistical variation in the modal parameters via the bootstrap technique.

To clearly illustrate the analysis of modal parameters using the bootstrap, we chose to numerically simulate the two degree of freedom, base-excited system shown in Figure 5.

The equations governing its motion are

ẍ2 + c12(ẋ2 − ẋ1) + k12(x2 − x1) = 0

ẍ1 + c12(ẋ1 − ẋ2) + c10(ẋ1 − ẋ0) + k12(x1 − x2) + k10(x1 − x0) = 0    (2)

where ẍ0 is a wide-band random noise.


Figure 5. Two degree of freedom system.

The equations describe a two degree of freedom system. All mass values have been set to unity. The two dynamic resonant frequencies of this system are

f1 = 0.6180 Hz,  f2 = 1.6180 Hz    (3)

This system is simulated using the Matlab™ code Simulink™. To simulate a realistic measurement environment and ensure that a measurable difference occurs in the estimated frequency values from each data block, broad-band Gaussian random noise is added to both x1 and x2. Uncorrelated noise whose rms value is equal to approximately eight percent of the standard deviation of the responses is added to each response. Eight thousand points of each output are computed using a fourth order Runge-Kutta algorithm. A cubic spline interpolation ensures equal time steps of 0.05 seconds between each pair of data points. Four hundred point blocks are selected from this 8,000 point data set using a uniform random index, which allows the 400 contiguous points composing a block to start at any index between 1 and 7,600. Each 400 point block is a bootstrap sample of the original time series. Twelve hundred of these 400 point blocks are selected, and a system identification algorithm is used to compute the modal frequencies. The 1,200 values of estimated modal frequencies are bootstrap replicates.
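The block-selection step can be sketched as below. This is an illustrative reconstruction under the stated block and series lengths, not the simulation code itself; the function name and seed are arbitrary:

```python
import random

def random_blocks(series, block_len=400, n_blocks=1200, seed=2):
    """Select contiguous blocks of block_len points starting at uniformly
    random indices; each block is one bootstrap sample of the series."""
    rng = random.Random(seed)
    last_start = len(series) - block_len  # 8000 - 400 = 7600 in the paper
    blocks = []
    for _ in range(n_blocks):
        start = rng.randint(0, last_start)  # uniform start index, inclusive
        blocks.append(series[start:start + block_len])
    return blocks
```

Feeding each block to the identification algorithm then yields one bootstrap replicate of the modal frequencies.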

6. Computation of the Modal Parameters Using Canonical Variate Analysis (CVA)

Many system identification algorithms are used to identify modal parameters. A robust algorithm is Canonical Variate Analysis (Larimore, 1990). The Canonical Variate Analysis method for multivariate system identification is based on the method of canonical variables originally developed by Hotelling (1936) for static multivariate statistical analysis. It was extended to time series by Akaike (1975) for a limited class of processes, and to general linear systems by Larimore (1983, 1990). CVA produces accurate estimates of system parameters using a relatively small data set. Further, the algorithm is quite robust (Hunter, 1995).

The data are assumed to come from a linear, time-invariant, multi-input, multi-response system. Colored nonstationary Gaussian noise may contaminate the data, as may biases, trends, or deterministic polynomial functions of time.

Following is a brief explanation of CVA; details can be found in the references. Conceptually, CVA deals directly with the input time series u(t) and the response time series y(t) of a system like that of Figure 5. The system characteristics are developed based on measured data that reflect past and future behavior of the system relative to a given time. Time series representing the past p(t) and the future g(t) of the process are formed, where, at a given time t0,

p(t0) = [u(t0 − τ), u(t0 − 2τ), ..., u(t0 − jτ), y(t0 − τ), y(t0 − 2τ), ..., y(t0 − kτ)]
g(t0) = [y(t0), y(t0 + τ), y(t0 + 2τ), ...]    (4)

where τ is the measurement time interval. Combining past and future values at successive times leads to "delay" matrices P and G of the form shown below.

P = | u(t0 − τ)  u(t0 − 2τ)  ...  u(t0 − jτ)  y(t0 − τ)  y(t0 − 2τ)  ...  y(t0 − kτ) |
    | u(t1 − τ)  u(t1 − 2τ)  ...  u(t1 − jτ)  y(t1 − τ)  y(t1 − 2τ)  ...  y(t1 − kτ) |
    |    ...                                                                          |
    | u(tm − τ)  u(tm − 2τ)  ...  u(tm − jτ)  y(tm − τ)  y(tm − 2τ)  ...  y(tm − kτ) |    (5)

Minimization of the error in predicting the future g(t) from the past p(t) is accomplished using a series of singular value decompositions on P and G, leading to a transformation matrix T which selects from the past the information critical to the prediction of the future. The selection of an optimal "memory" of the past is described by

m(t) = T p(t)    (7)

Once m(t) is evaluated at every time t, the state equations may be obtained in a least squares sense using

m(i + 1) = A m(i) + B u(i) + w(i)
y(i) = C m(i) + D u(i) + E w(i) + v(i)    (8)

In Eq. (8), m(i), u(i), and y(i) are known at any time t_i; w(i) and v(i) are white noise processes which result from errors in the solution, and A, B, C, D, and E are determined using least squares. The term Ew(i) allows for possible correlation between the state noise w(i) and the measurement noise v(i). The possibility of this correlation is necessary to obtain a minimal order state space representation of the process. This process is described in detail in Larimore (1994).

From the fitted state space model, several quantities can be computed, including the transfer functions, power spectra, covariance functions, the system step responses, and, in particular, the system modal parameters. To identify 1,200 sets of modal parameters for our two-degree-of-freedom system, we successively input to the CVA algorithm the 400 point data blocks derived from a random selection of the indices in the 8,000 point data set. Each data set provides one estimate of the two modal frequencies of the system. These 1,200 frequency values are the bootstrap replicates we use to compute the statistics on our estimated modal parameters.
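The modal-frequency extraction at the end of this pipeline can be sketched generically: given a fitted discrete-time state matrix A and the sampling interval, the modal frequencies and damping ratios follow from the eigenvalues of A. The numpy sketch below is an assumed, generic implementation of that standard conversion, not the CVA code used in the paper:

```python
import numpy as np

def modal_parameters(A, dt):
    """Natural frequencies (Hz) and damping ratios from the eigenvalues
    of a discrete-time state matrix A fitted with time step dt."""
    z = np.linalg.eigvals(A)
    s = np.log(z) / dt            # discrete poles -> continuous poles
    s = s[np.imag(s) > 0]         # keep one member of each complex pair
    wn = np.abs(s)                # natural frequency, rad/s
    zeta = -np.real(s) / wn       # damping ratio
    freqs = wn / (2.0 * np.pi)    # convert to Hz
    order = np.argsort(freqs)
    return freqs[order], zeta[order]
```

Applying such a conversion to the state matrix fitted from each 400 point block yields one bootstrap replicate of the pair (f1, f2).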

We used the 1,200 bootstrap replicates of the modal frequencies to estimate their standard errors and confidence intervals. The standard errors are

se(f1) = 7.04 × 10⁻⁴
se(f2) = 3.07 × 10⁻³    (9)

The 95% confidence intervals are

f1: (0.6168, 0.6199)
f2: (1.6130, 1.6249)    (10)

Figures 6a and 6b are the kernel density estimators (estimators of the probability density functions; see Silverman, 1986) of the modal frequencies, based on the bootstrap replicates. They help to clarify the results shown in Eqs. (9) and (10).

Figure 6a. Kernel density estimator of the first modal frequency based on its bootstrap replicates.

Figure 6b. Kernel density estimator of the second modal frequency based on its bootstrap replicates.

With the bootstrap replicates in hand, it is also an easy matter to estimate joint confidence regions for groups of statistics. This can be done in a parametric (assumed form) framework or a nonparametric framework. Consider, for example, a parametric framework. We estimated the 95% joint confidence region for the two modal frequencies; it is shown in Figure 7. It was obtained by using the bootstrap replicates of the first and second modal frequencies to form the quantity


Q_j = [ ((f1j − f̄1)/se(f1))² − 2ρ̂ ((f1j − f̄1)/se(f1)) ((f2j − f̄2)/se(f2)) + ((f2j − f̄2)/se(f2))² ] / (1 − ρ̂²)    (11)

then finding the 95% upper confidence limit for Q. The f1j, f2j are the bootstrap replicates of the modal frequencies; the f̄1, f̄2 are their estimated means; the se(f1), se(f2) are their standard errors; ρ̂ is their estimated correlation, obtained from their estimated covariance.
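One way to realize such a parametric joint region is to compute, for each replicate pair, a Mahalanobis-type quadratic form and take its 95th percentile as the region boundary. The numpy sketch below is a generic illustration of that idea, with assumed function names; it is not the authors' computation:

```python
import numpy as np

def joint_region_threshold(f1_reps, f2_reps, level=0.95):
    """Quadratic-form threshold bounding the joint confidence region
    of two statistics, estimated from their bootstrap replicates."""
    f = np.column_stack([f1_reps, f2_reps])
    mean = f.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(f, rowvar=False))
    d = f - mean
    # Q_j = (f_j - mean)^T cov^{-1} (f_j - mean) for every replicate j
    q = np.einsum("ij,jk,ik->i", d, cov_inv, d)
    return np.percentile(q, 100.0 * level)
```

A candidate frequency pair lies inside the region when its quadratic form falls below the returned threshold; sweeping the (f1, f2) plane against that threshold traces out an elliptical region like the one in Figure 7.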

The standard error estimates and the confidence intervals developed here cannot be accurately obtained by other methods.

Figure 7. The 95% joint confidence region for the two modal frequencies, based on the parametric model in Eq. (11).

7. CONCLUSIONS

The bootstrap approach provides a useful alternative to traditional statistical analysis in numerous situations involving real-world data. We have shown, for a linear system driven by Gaussian noise, that one effective bootstrap technique uses a random sample of measured data to assess the variation in system modal parameters. The bootstrap provides an approach for assessing statistical variation that cannot be accurately evaluated using other techniques. The bootstrap does not rely on assumptions of Gaussian distribution in the excitation or response data. The primary difficulty in applying the bootstrap to the analysis of statistical accuracy is that it is computationally intensive.

8. ACKNOWLEDGMENT

This work was performed at Los Alamos and Sandia Laboratories and supported by the Department of Energy.

9. REFERENCES

Akaike, H. (1975), "Markovian Representation of Stochastic Processes by Canonical Variables," SIAM Journal on Control, Vol. 13, pp. 162-173.

Bendat, J. S. and Piersol, A. G. (1986), Random Data: Measurement and Analysis Procedures, Second Edition, Wiley Interscience.

Efron, B. (1979), "Bootstrap Methods: Another Look at the Jackknife," Annals of Statistics, 7, pp. 1-26.

Efron, B. and Tibshirani, R. J. (1993), An Introduction to the Bootstrap, Monographs on Statistics and Applied Probability 57, Chapman and Hall.

Hotelling, H. (1936), "Relations Between Two Sets of Variates," Biometrika, 28, pp. 321-377.

Hunter, N. (1995), "A Comparison of State Model Estimation Using Canonical Variate Analysis and Eigensystem Realization Analysis," Los Alamos National Laboratory Technical Report 95-1275, submitted for publication.

Hunter, N. and Paez, T. (1995), "Application of the Bootstrap to the Analysis of Vibration Test Data," Proceedings of the 66th Shock and Vibration Symposium, Shock & Vibration Information Analysis Center, Biloxi, Mississippi.

Larimore, W. (1983), "System Identification, Reduced Order Filtering, and Modeling via Canonical Variate Analysis," Proceedings of the 1983 American Control Conference, H. S. Rao and P. Dorato, eds., pp. 445-451.

Larimore, W. (1990), "Canonical Variate Analysis in Identification, Filtering, and Adaptive Control," Proceedings of the 29th IEEE Conference on Decision and Control, Honolulu, Hawaii, Vol. 1, pp. 635-639.

Larimore, W. (1994), "The Optimality of Canonical Variate Identification by Example," SYSID '94: IFAC Symposium on System Identification and Parameter Estimation, July 1994.

Silverman, B. W. (1986), Density Estimation for Statistics and Data Analysis, Chapman and Hall.

Wirsching, P., Paez, T., and Ortiz, K. (1995), Random Vibrations: Theory and Practice, Wiley, New York.